Will the next gen consoles be powerful enough...for HD?
Moderator: Thanas
Here's something that struck me...
The next gen consoles are supposed to be, what, 10x more powerful than current generation at most (excluding the ultrahyped PS3, which we can't tell for sure).
It's been reported that the next gen consoles will all support HD resolution for games.
I had assumed this was a given.
But then I stopped. And I thought about it. And I thought...
WOAH.
Normal TVs are 640x480 resolution, correct?
HDTVs are 1,920x1,080.
Normal TVs have a total of 307,200 pixels.
HDTVs have a total of 2,073,600 pixels.
That's...6.75 times as many pixels!
That would be a HUGE limiting factor: the console may be 10x more powerful, but it has to render nearly 7x as many pixels on screen at once.
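A quick back-of-the-envelope check of that arithmetic (a sketch in Python; the 10x figure is just the rumor quoted above, not a spec):

```python
# Pixel-count comparison between a standard 640x480 frame and a 1920x1080 HD frame.
sd_pixels = 640 * 480        # 307,200
hd_pixels = 1920 * 1080      # 2,073,600

ratio = hd_pixels / sd_pixels
print(f"HD has {ratio:.2f}x the pixels of SD")          # 6.75x

# If raw horsepower only grows ~10x while the pixel count grows 6.75x,
# the per-pixel budget improves by only about 1.5x.
print(f"Per-pixel headroom: ~{10 / ratio:.2f}x")        # ~1.48x
```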
Could this mar the next generation? Could it mean that game developers have a choice of:
A) Supporting HD resolutions, but having not much more detail than current gen consoles
B) Only allowing 640x480 resolutions and not supporting HD, but having way better graphics
C) Putting it as a user option, where they get WAY more detail if they turn off HD?
Or is there something I'm missing?
Take a look at the resolutions offered by PC gaming rigs sometime. They're easily hitting 1600x1200 at high frame rates in most games, and would probably stretch higher if monitors that supported higher resolutions were available.
The next gen consoles are using next generation PC graphics chips (the XBox Next, for example, is using a graphics chip based on ATi's R500)
They can already render more pixels than they do; the Xbox supports 1080i already, although not many games take advantage of it. The TV's resolution is already a bottleneck for consoles.
Vendetta wrote: Take a look at the resolutions offered by PC gaming rigs sometime. They're easily hitting 1600x1200 at high frame rates in most games, and would probably stretch higher if monitors that supported higher resolutions were available.
The next gen consoles are using next generation PC graphics chips (the XBox Next, for example, is using a graphics chip based on ATi's R500)
They can already render more pixels than they do; the Xbox supports 1080i already, although not many games take advantage of it. The TV's resolution is already a bottleneck for consoles.
Looking at this:
http://graphics.tomshardware.com/graphi ... ts-08.html
Far Cry, max quality settings and AA, 1600x1200 (slightly less than HD), on a brand new $600 graphics card...gets 32.4 FPS.
That's 2 FPS more than the 30 FPS magic number. That's not easily hitting high frame rates.
I'm hoping next gen consoles are going to do better than that, but the XBox 2 is supposed to be coming out before the end of 2005.
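For comparison, 1600x1200 is already close to a full 1920x1080 frame in raw pixel count. A naive, purely pixel-limited extrapolation from that 32.4 FPS figure (real performance depends on much more than resolution, so treat this strictly as a sketch):

```python
# Naive pixel-count scaling from the 1600x1200 Far Cry benchmark quoted above.
bench_pixels = 1600 * 1200      # 1,920,000
hd_pixels    = 1920 * 1080      # 2,073,600
bench_fps    = 32.4

# Crude assumption: frame rate scales inversely with pixels drawn.
estimated_hd_fps = bench_fps * bench_pixels / hd_pixels
print(f"Estimated FPS at 1920x1080: {estimated_hd_fps:.1f}")   # ~30.0
```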
- Admiral Valdemar
- Outside Context Problem
- Posts: 31572
- Joined: 2002-07-04 07:17pm
- Location: UK
Screw the TV. I'll just get a decent CRT and use that instead. There's no point getting a HDTV given the price if there's jack all broadcasting in it and games aren't all looking into it yet.
There are various "forms" of HD. 1080 refers to 1080 interlaced, I believe... which means it's actually 540X960, with each individual "field" only covering half the screen (alternating lines make one field... two fields make one frame).
Then there's 720P, which is true 720X1080 or something along those lines.
Needless to say, there's more to it than simple pixel count. Besides, anything above a rough equivalent of 1024X768 ought to be plenty, especially if AA is used.
EDIT:
Admiral Valdemar wrote: There's no point getting a HDTV given the price if there's jack all broadcasting in it and games aren't all looking into it yet.
The Congress of the United States disagrees with you.
The Great and Malignant
They have a $400 27" HDTV over at the local Best Buy. My sister works there, I can pick it up for $200 if I decide to and she makes the purchaseAdmiral Valdemar wrote:Screw the TV. I'll just get a decent CRT and use that instead. There's no point getting a HDTV given the price if there's jack all broadcasting in it and games aren't all looking into it yet.
Last edited by Praxis on 2005-03-02 07:07pm, edited 1 time in total.
SPOOFE wrote: There are various "forms" of HD. 1080 refers to 1080 interlaced, I believe... which means it's actually 540X960, with each individual "field" only covering half the screen (alternating lines make one field... two fields make one frame).
Then there's 720P, which is true 720X1080 or something along those lines.
Needless to say, there's more to it than simple pixel count. Besides, anything above a rough equivalent of 1024X768 ought to be plenty, especially if AA is used.
That makes sense.
So developers would probably use an HD format equivalent to most computer monitors rather than the highest HD res possible.
The one thing I find really cool about HD is that if you're playing a four-player split-screen game, each split has about as much space as a normal TV.
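Checking that split-screen point against the 1920x1080 figure from earlier (each quadrant actually works out to somewhat more than a standard 640x480 frame):

```python
# One quadrant of a 1920x1080 screen in four-player split screen.
quad_w, quad_h = 1920 // 2, 1080 // 2          # 960 x 540
quad_pixels = quad_w * quad_h                  # 518,400
sd_pixels = 640 * 480                          # 307,200

print(f"Each split: {quad_w}x{quad_h} = {quad_pixels:,} pixels")
print(f"That's {quad_pixels / sd_pixels:.2f}x the pixels of a 640x480 TV")  # ~1.69x
```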
- Ace Pace
- Hardware Lover
- Posts: 8456
- Joined: 2002-07-07 03:04am
- Location: Wasting time instead of money
- Contact:
Praxis wrote: Looking at this:
http://graphics.tomshardware.com/graphi ... ts-08.html
Far Cry, max quality settings and AA, 1600x1200 (slightly less than HD), on a brand new $600 graphics card...gets 32.4 FPS.
That's 2 FPS more than the 30 FPS magic number. That's not easily hitting high frame rates.
I'm hoping next gen consoles are going to do better than that, but the XBox 2 is supposed to be coming out before the end of 2005.
I prefer looking at this: http://anandtech.com/video/showdoc.aspx?i=2278&p=3
1600, gets high FPS.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
SPOOFE wrote: There are various "forms" of HD. 1080 refers to 1080 interlaced, I believe... which means it's actually 540X960, with each individual "field" only covering half the screen (alternating lines make one field... two fields make one frame).
No, 1080i has 1080 scanlines but only one field is drawn at a time. Don't think pixels when it comes to television, think scanlines. IIRC there is 1080p but that's fairly rare and takes a huge amount of bandwidth. I've seen HDCAM footage of it, though, very nice.
SPOOFE wrote: Then there's 720P, which is true 720X1080 or something along those lines.
720 scanlines drawn simultaneously, yes. Smoother image but inferior resolution compared to 1080i.
SPOOFE wrote: The Congress of the United States disagrees with you.
The FCC is forcibly phasing out NTSC and replacing it with ATSC. Some HDTV sets might not actually be capable of displaying HDTV resolution (that is, a minimum of 720p) -- they may display SDTV (480i) or EDTV (480p).
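For reference, here's a quick sketch of the pixel grids behind the format names being thrown around (using the square-pixel 640x480 figure the thread uses for SD; digital SD broadcasts are often carried at 704x480 or 720x480):

```python
# Common television picture formats: width, height, and scan type
# (i = interlaced fields, p = progressive frames).
formats = {
    "480i (SDTV)":  (640, 480, "interlaced"),
    "480p (EDTV)":  (640, 480, "progressive"),
    "720p (HDTV)":  (1280, 720, "progressive"),
    "1080i (HDTV)": (1920, 1080, "interlaced"),
    "1080p (HDTV)": (1920, 1080, "progressive"),
}

for name, (w, h, scan) in formats.items():
    print(f"{name:14s} {w:>4}x{h:<4} {w * h:>9,} pixels  {scan}")
```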
- The Cleric
- BANNED
- Posts: 2990
- Joined: 2003-08-06 09:41pm
- Location: The Right Hand Of GOD
Ace Pace wrote: I prefer looking at this: http://anandtech.com/video/showdoc.aspx?i=2278&p=3
Praxis wrote: Looking at this:
http://graphics.tomshardware.com/graphi ... ts-08.html
Far Cry, max quality settings and AA, 1600x1200 (slightly less than HD), on a brand new $600 graphics card...gets 32.4 FPS.
That's 2 FPS more than the 30 FPS magic number. That's not easily hitting high frame rates.
I'm hoping next gen consoles are going to do better than that, but the XBox 2 is supposed to be coming out before the end of 2005.
1600, gets high FPS.
And how much did you pay to get that quality again?
{} Thrawn wins. Any questions? {} Great Dolphin Conspiracy {} Proud member of the defunct SEGNOR {} Enjoy the rythmic hip thrusts {} In my past life I was either Vlad the Impaler or Katsushika Hokusai {}
Yes.
But because half of the scanlines are always static, an interlaced picture will always look smoother than a progressive one, both in terms of flickering and in terms of smoothness of motion. (Compare your TV picture, which is interlaced, with a monitor at 60Hz, and see the difference: the monitor has visible flicker, the TV much less, even on a pure white screen.)
SPOOFE wrote: There are various "forms" of HD. 1080 refers to 1080 interlaced, I believe... which means it's actually 540X960, with each individual "field" only covering half the screen (alternating lines make one field... two fields make one frame).
This is true, but it doesn't reduce the graphics hardware requirements, because each field is still drawn 30 times per second. In other words, instead of 30 full frames per second, you're putting out 60 half-frames per second.
Vendetta wrote: But because half of the scanlines are always static, an interlaced picture will always look smoother than a progressive one, both in terms of flickering and in terms of smoothness of motion. (Compare your TV picture, which is interlaced, with a monitor at 60Hz, and see the difference: the monitor has visible flicker, the TV much less, even on a pure white screen.)
Umm, interlaced images are well known for artifacts which are non-existent in progressive (line twitter, feathering); this is especially true for motion. Also, on a progressive display, each pixel is being refreshed 60 times per second, twice as often as interlaced's 30.
This is the entire reason why monitors are non-interlaced: at the typical viewing distance for computer monitors (<1m), the interlacing artifacts would be intolerable.
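To put some rough numbers behind the "60 half-frames per second" point above, here's a sketch of display pixel throughput at the nominal rates (display scan-out only, not what the GPU actually has to render with overdraw and effects):

```python
# Pixels scanned out per second at nominal 60 Hz rates.
# 1080i: 60 fields/s, each field carrying half of the 1080 scanlines.
px_1080i = 1920 * (1080 // 2) * 60     # 62,208,000 pixels/s
# 1080p at 60 full frames per second.
px_1080p = 1920 * 1080 * 60            # 124,416,000 pixels/s
# 720p at 60 full frames per second.
px_720p  = 1280 * 720 * 60             # 55,296,000 pixels/s
# 480i (standard TV): 60 fields/s of 240 lines each.
px_480i  = 640 * 240 * 60              # 9,216,000 pixels/s

for name, px in [("1080i", px_1080i), ("1080p", px_1080p), ("720p", px_720p), ("480i", px_480i)]:
    print(f"{name}: {px:,} pixels/s")
```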
- Vertigo1
- Defender of the Night
- Posts: 4720
- Joined: 2002-08-12 12:47am
- Location: Tennessee, USA
- Contact:
Admiral Valdemar wrote: Screw the TV. I'll just get a decent CRT and use that instead. There's no point getting a HDTV given the price if there's jack all broadcasting in it and games aren't all looking into it yet.
I say screw both HD and the CRT. Make the fuckers with a DVI output and run that to a bigscreen.
"I once asked Rebecca to sing Happy Birthday to me during sex. That was funny, especially since I timed my thrusts to sync up with the words. And yes, it was my birthday." - Darth Wong
Leader of the SD.Net Gargoyle Clan | Spacebattles Firstone | Twitter
- The Kernel
- Emperor's Hand
- Posts: 7438
- Joined: 2003-09-17 02:31am
- Location: Kweh?!
Next-gen consoles will support HD resolutions across the board, probably in 720p mode for most games. The main factor in pumping out higher resolutions is memory bandwidth, and these next-gen boxes will have bandwidth to burn for precisely this reason.
1080i is trickier; a GPU isn't going to render games internally in 1080i, it will have to render them at 1080p and then convert. This means the resolution would be higher than most games are played at today, even on high end cards. I imagine that developers will have a choice of resolutions to use, but 720p will be the most common.
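A back-of-the-envelope look at why bandwidth matters here: just writing a 32-bit color buffer once per displayed frame costs this much, before overdraw, texturing, Z-buffering, or anti-aliasing (illustrative numbers only, not specs for any particular console):

```python
# Rough framebuffer write cost at 60 frames per second,
# assuming 32-bit (4-byte) color and a single write per pixel.
BYTES_PER_PIXEL = 4
FPS = 60

def framebuffer_mb_per_s(width, height):
    """MB/s to write one full color buffer per frame at 60 fps."""
    return width * height * BYTES_PER_PIXEL * FPS / 1e6

for name, (w, h) in {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{name}: {framebuffer_mb_per_s(w, h):6.1f} MB/s")
# 480p:   73.7 MB/s
# 720p:  221.2 MB/s
# 1080p: 497.7 MB/s
```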
- The Kernel
- Emperor's Hand
- Posts: 7438
- Joined: 2003-09-17 02:31am
- Location: Kweh?!
phongn wrote: No, 1080i has 1080 scanlines but only one field is drawn at a time. Don't think pixels when it comes to television, think scanlines. IIRC there is 1080p but that's fairly rare and takes a huge amount of bandwidth. I've seen HDCAM footage of it, though, very nice.
Most high end projectors today support the 1080p resolution, as do a few plasmas. From what I gather, this is mostly done with the internal scaling chip, so you wouldn't need any more bandwidth than the regular component/DVI/HDMI interface provides.
Hmm, consider myself corrected regarding interlacing.
And we need next gen games, not current.
I don't think you understand the way console hardware evolves.
Computer hardware is constantly scaling up, small improvements that happen every couple months... someone puts out some faster RAM, or a few hundred more megahertz in a processor, or a graphics card gets an OC upgrade... either way, every three, four months, computer hardware has taken another small incremental step forward...
Consoles, conversely, grow in leaps, but on a longer time scale... every five years or so, they get triple, quadruple, quintuple the quality. On a long-term scale, averaged out, computer hardware and console hardware advances at about the same rate... it's just that consoles make quantum leaps on the scale of years, while PC hardware makes tiny leaps constantly.
WHICH MEANS... when a console is first released, it smashes the average PC but good. This is accomplished with more efficient design, standardized hardware, etc. Of course, a console will, after two years or so, become surpassed by the average computer, and by the end of its lifespan it will be beaten by even a low-end PC. Then, of course, a new console comes out and smashes PC's all over again...
Anyway. Enough blah-blah from me. I'll be back in my corner if anyone needs me.
The Great and Malignant
- Ace Pace
- Hardware Lover
- Posts: 8456
- Joined: 2002-07-07 03:04am
- Location: Wasting time instead of money
- Contact:
The Cleric wrote: And how much did you pay to get that quality again?
Ace Pace wrote: Looking at my newegg.com receipt and totaling the PC, under 1800$.
Praxis wrote: In a $300 console, whoopee
You realize that the console companies lose money for every console sale...Right? They make it up by up-marking games; they can't make that kind of hardware and sell it at a 'fair' price, much less a price they would gain money on.
Brotherhood of the Bear | HAB | Mess | SDnet archivist |
- The Kernel
- Emperor's Hand
- Posts: 7438
- Joined: 2003-09-17 02:31am
- Location: Kweh?!
Not quite by "up-marking" games, they are able to recoup the losses by the volume of games they ship (in the console world, even a midlevel title can outsell a AAA PC title). Also, they don't consistantly lose money on the consoles--it is estimated that Microsoft was dumping around $125-$150 per Xbox sold at launch (slightly less for Sony) but they usually achieve the break even point after the first 18-24 months, and even achieve profitability thereafter on hardware sales.Ace Pace wrote: You realize that the the console companies lose money for every console sale...Right? They make it up by up-marking games, they can't that kind of hardware and sell it at a 'fair' price, much less a price they would gain money on.
The exception to this is the Gamecube which probably never resulted in a loss per unit for Nintendo. This is why the Gamecube isn't as powerful as the Playstation and Xbox (relative to its release date in any case); it uses a cheap off-the-shelf integrated PowerPC CPU (typically reserved for high volume routers) and a DirectX 7 era graphics unit.
The Kernel wrote: Most high end projectors today support the 1080p resolution, as do a few plasmas. From what I gather, this is mostly done with the internal scaling chip, so you wouldn't need any more bandwidth than the regular component/DVI/HDMI interface provides.
Well, yes, but 1080p is (as I noted) fairly rare. Most people won't be able to afford it for quite a while.
Vertigo1 wrote: I say screw both HD and the CRT. Make the fuckers with a DVI output and run that to a bigscreen.
You are aware that many big screen televisions are CRT-based, yes?
Praxis wrote: So, if I understand this right, i (such as 1080i) draws about half the scanlines every 1/60th of a second, doing them all in 1/30th of a second, and p (such as 720p) draws the whole screen every 1/60th of a second.
No, progressive scan is 30 frames per second. You are correct that interlaced is 60 fields per second.
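To make the field/frame timing concrete (nominal 60 Hz numbers; actual NTSC-family timing is 59.94 Hz, and progressive formats are broadcast at several different frame rates):

```python
# Nominal field and frame periods.
FIELD_RATE = 60                      # interlaced fields per second

field_period = 1000 / FIELD_RATE     # ~16.7 ms per field (half the scanlines)
frame_period = 2 * field_period      # ~33.3 ms for a complete interlaced frame

print(f"Interlaced: a field every {field_period:.1f} ms, a full frame every {frame_period:.1f} ms")
print(f"Progressive at 30 fps: a full frame every {1000 / 30:.1f} ms")
print(f"Progressive at 60 fps: a full frame every {1000 / 60:.1f} ms")
```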