Alien: Colonial Marines

bilateralrope
Sith Acolyte
Posts: 6167
Joined: 2005-06-25 06:50pm
Location: New Zealand

Re: Alien: Colonial Marines

Post by bilateralrope »

TheFeniX wrote: Grah! The board update ate my draft. I was really drunk while writing it, so I usually avoid posting until my sobriety gets a chance to read it over. You get the short version.
That's still something I can reply to.
SteamSpy works fairly well and only tracks about 450k active Batman games. That's still good money, but doesn't really compete with the console sales.
You haven't produced the console numbers to compare the PC numbers with.

Which Batman game?
Because if you're talking about Arkham Knight, there are two big factors that would keep SteamSpy's numbers down:
- Refunds. A refunded game is no longer owned. Arkham Knight got a lot of them.
- Arkham Knight was pulled from Steam a few days after release.

So I don't see why I should consider Arkham Knight numbers as useful data.
EA breaks out its PC sales into everything aside from mobile games.
What about the numbers from Ubisoft or CD Projekt RED?
This means we really can't see how much money they are raking in off their shitty browser games and other content way outside the "AAA action orra-whatever" market.
TotalBiscuit put up a video titled "I will now talk about why core games fail on mobile" that runs just over 30 minutes. I think he has a point: mobile games ported to PC will always do badly because of the limitations forced on them by the touchscreen being the only input.
Your actual games requiring some horsepower usually dominate on consoles and merely do damn well on PC.
Prove the "merely do damn well on PC" part. Specifically, I want you to compare PC, PS4 and XBone sales numbers for a game with a decent PC port. I remember the ports for Mad Max and Shadows of Mordor being well regarded, so I can guarantee I won't argue the "decent PC port" if you use them. But there are plenty of other games you could use instead.

You will need to show significantly fewer sales on PC than on either of those consoles to prove your position.
Purple
Sith Acolyte
Posts: 5233
Joined: 2010-04-20 08:31am
Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.

Re: Alien: Colonial Marines

Post by Purple »

TheFeniX wrote: That is not a 5% difference. To me, it's like watching a video vs a PowerPoint slide-show. It's grating and it gives me headaches.
A headache? Er... I have never had that sort of problem. Like, as long as the frame rate is stable I genuinely don't really care if a game is doing 15 or 50.
And since frame-rate drops are and always have been a thing in everything since Space Invaders, going sub-30 is excruciating even with motion blur.
That's the thing. We are talking under the assumption of a stable frame rate. As I mentioned before, I too can notice changes. Maybe not +/- 1 like you can, but I do notice them. And I find that in any game, stability counts for a lot more than the actual rate. For me a stable 15 looks better than an unstable 55-60.
For the record, I notice single-digit drops in FPS when running at 60FPS instantly. It's not rocket science, I've just been playing video games that long. Depending on the source, I can also instantly tell if I'm not running 1080p. Then again, for 1080i, I have to see it in motion. But 720p vs 1080p is instantly apparent.
It depends on the source though. Like most of the stuff I've seen looks almost exactly the same in both. You really have to optimize your graphics to get a difference worth noticing. It's down to the textures more than anything.
Tick rate is just one facet of why higher FPS is superior. And shooters are not the only games where it is desirable.
Thing is, I only see it as an issue if two players have different frame rates because this does give an advantage to one. If both have the same FPS you really could be playing at 15 as far as I am concerned.
No, it's just the only thing you can compare them to. Sub 60 framerates even in RTS are annoying. I'd much rather have a smooth transition across the map, being able to make out units while scrolling, than the screen being this jumbled mess of individual frames that makes my head want to explode.
A jumbled mess of individual frames? What are you talking about? Like, can you even watch movies then? IIRC all cinema movies and cartoons are done at 24 FPS. According to you that should be unwatchable.
I had this issue in Starcraft 2 once. I was getting <30 FPS. I was getting nauseous. No idea what was going on. Not until I looked and saw I had left Skyrim running in the background. Fixed that real quick.
I have never once gotten ill from watching a computer monitor. Like, at this point I am beginning to suspect you just have a strange medical issue.
But there are certain things less experienced gamers "do not mind." Hardware vs software mouse and acceleration (software is fucking awful and lags behind your actual movements) is one of them. That you don't get it doesn't matter. That it doesn't affect you doesn't mean it's not a huge issue to some people. No offense, but when you say "not a big deal, it's only 5%," you're basically saying "I have no idea what I'm talking about."
It really isn't a huge issue though. Certainly not as huge as you make it out to be. Like, you make it sound as if the wrong option is going to introduce 1-2 seconds of lag in your mouse and not several measly milliseconds. I mean sure, I can see how that sort of thing might be important if you are Korean and want to hone your 2000 clicks per second skill for Starcraft, or if you are playing an FPS in a tournament professionally, but I fail to see how that is relevant to the 90% of the gaming population who don't.
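For scale: a software-rendered cursor trails the real input by some whole number of rendered frames, and a frame at F FPS lasts 1000/F milliseconds, so the penalty sits in the tens of milliseconds rather than seconds. A rough back-of-the-envelope sketch in C++ (the 1-3 frame lag counts are assumed for illustration, not measured values):

Code:

#include <cstdio>

// Rough back-of-the-envelope: a software-rendered cursor trails real input
// by whole rendered frames. The frame counts here are assumed for
// illustration, not measured values.
int main() {
    const double fps_values[] = {30.0, 60.0};
    const int lag_frames[] = {1, 2, 3};
    for (double fps : fps_values)
        for (int n : lag_frames)
            std::printf("%2.0f FPS, %d frame(s) of lag: %4.1f ms\n",
                        fps, n, n * 1000.0 / fps);
}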
It has become clear to me in the previous days that any attempts at reconciliation and explanation with the community here have failed. I have tried my best. I really have. I poured my heart out trying. But it was all for nothing.

You win. There, I have said it.

Now there is only one thing left to do. Let us see if I can sum up the strength needed to end things once and for all.
Purple
Sith Acolyte
Posts: 5233
Joined: 2010-04-20 08:31am
Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.

Re: Alien: Colonial Marines

Post by Purple »

accidentally hit the quote button instead of the edit button
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Alien: Colonial Marines

Post by TheFeniX »

bilateralrope wrote: Prove the "merely do damn well on PC" part. Specifically, I want you to compare PC, PS4 and XBone sales numbers for a game with a decent PC port. I remember the ports for Mad Max and Shadows of Mordor being well regarded, so I can guarantee I won't argue the "decent PC port" point if you use them. But there are plenty of other games you could use instead.
Bethesda had a blog post a few years back talking about how the majority of pre-order money came out of consoles (like 80%+). Activision posted something similar about MW3 (or maybe it was 2). Those posts are gone, and tracking pre-order and day-1 release sales for digital distribution is kind of a "had to be there" deal. But they ended up making tons of money off PC in the long run due to sales, etc. However, the current emphasis on the "AAA blockbuster" being on pre-orders means they considered those types of figures worthless. They want their money and they want it now.

I plan to keep an eye on how the Fallout 4 release goes as it's going to be a big one, but at this point in time: I'll concede you the argument.
Purple wrote: A headache? Er... I have never had that sort of problem. Like, as long as the frame rate is stable I genuinely don't really care if a game is doing 15 or 50.
If you forced me to play at a "stable 15FPS," I'd rather not play the game. The amount of eye-strain alone that creates for me is incredible. You call it some kind of sickness, but to me it's no different than if I walked outside and there was a persistent "strobe light" effect on all movement. Cars not driving down the street, but teleporting a few feet at a time.
That's the thing. We are talking under the assumption of a stable frame rate. As I mentioned before, I too can notice changes. Maybe not +/- 1 like you can, but I do notice them. And I find that in any game, stability counts for a lot more than the actual rate. For me a stable 15 looks better than an unstable 55-60.
To be fair, I meant "single-digitS." At 60FPS, I can generally notice a 4FPS drop or more. At 30, it probably is in the 1-2 range. And 55-60 isn't what I would call "unstable." Honestly, if 15FPS gaming is fine to you, I doubt you could notice a 5FPS drop at 60FPS.
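For reference, frame time is 1000/FPS milliseconds, so the same drop reads very differently in time than in frames: 60 to 56 costs about 1.2 ms per frame, while 30 to 28 costs about 2.4 ms. A minimal sketch of that conversion, illustrative only:

Code:

#include <cstdio>

// Frame time in milliseconds for a given frame rate.
double frame_time_ms(double fps) { return 1000.0 / fps; }

int main() {
    // Convert some FPS drops into their per-frame time cost.
    const struct { double from, to; } drops[] = {{60, 56}, {30, 29}, {30, 28}};
    for (const auto& d : drops)
        std::printf("%2.0f -> %2.0f FPS: +%.2f ms per frame\n",
                    d.from, d.to, frame_time_ms(d.to) - frame_time_ms(d.from));
}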
It depends on the source though. Like most of the stuff I've seen looks almost exactly the same in both. You really have to optimize your graphics to get a difference worth noticing. It's down to the textures more than anything.
Just cranking the resolution of Final Fantasy X up to 1080p or greater on my PS2 emulator makes a world of difference in texture quality. It all depends on the game. Some older games already had "hi-res" (by the standards of the time) textures ready to display, but the native resolution just made them look like muddy garbage. That's kind of the thing: I'm sure the textures in FFX were in the 1024x768 range at least, but the average TV at the time was like 240i.

For some reason, Fable 3 looks markedly better at 4K than most games I run at that resolution. Certain textures are garbage, but for some reason 4K really brings out the normal mapping. I wonder if they threw in something like 2K textures because "Who cares" or in an effort to fight piracy by boosting the download size.
Thing is, I only see it as an issue if two players have different frame rates because this does give an advantage to one. If both have the same FPS you really could be playing at 15 as far as I am concerned.
No. It makes it easier to detect motion. Or more specifically, motion against the background. Higher FPS means we're also able to track targets better since we're used to seeing things at higher "FPS" anyway. In games where that isn't important, it's nice to be able to see what the fuck is going on. So more FPS is still better.
A jumbled mess of individual frames? What are you talking about? Like, can you even watch movies then? IIRC all cinema movies and cartoons are done at 24 FPS. According to you that should be unwatchable.
Left 4 Dead has a tactic called "corner humping" where all the survivors huddle in a corner and spam melee. With this, all the zombies clip into each other and become this flailing mass of... muddy textures and limbs on the Xbox 360. Playing it on PC at 60FPS, I can make out individual faces.

You ever wonder why fast paced fight scenes in movies are a jumbled mess of garbage? Like, can anyone even see what's going on during a fight scene in a Transformers movie? And movies are still watchable for me because 99% of the movies I watch are at 24FPS. I have no reference to complain otherwise. But start producing movies at nothing but 60FPS, get people used to them (note: The Hobbit caused nausea in some viewers) and going back to 24FPS would be difficult.

Go watch a soap-opera for 5 minutes and try to see just how much more detailed everything is in motion.
It really isn't a huge issue though. Certainly not as huge as you make it out to be. Like, you make it sound as if the wrong option is going to introduce 1-2 seconds of lag in your mouse and not several measly milliseconds. I mean sure, I can see how that sort of thing might be important if you are Korean and want to hone your 2000 clicks per second skill for Starcraft, or if you are playing an FPS in a tournament professionally, but I fail to see how that is relevant to the 90% of the gaming population who don't.
They take the same attitude you do: "I don't notice it, it's not a problem." I do notice it. It affects my enjoyment playing the game. I have the hardware to not deal with it. I don't deal with it.

There is no game I could play where 30FPS would be more enjoyable to play at than 60FPS. That's why it's a big deal.
Purple
Sith Acolyte
Posts: 5233
Joined: 2010-04-20 08:31am
Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.

Re: Alien: Colonial Marines

Post by Purple »

TheFeniX wrote: If you forced me to play at a "stable 15FPS," I'd rather not play the game. The amount of eye-strain alone that creates for me is incredible. You call it some kind of sickness, but to me it's no different than if I walked outside and there was a persistent "strobe light" effect on all movement. Cars not driving down the street, but teleporting a few feet at a time.
Say what? :wtf: The only time I have ever seen such an effect on a computer is with MMOs when the server is too full and they lag up. Certainly not in a half-competently programmed computer game otherwise.

Basically it's a question of how the game timers are done. If you have a game timer separate from the graphics timer, you will experience frame drop when the two dip too far out of sync. But any half-competent programmer making something where it can be avoided (as in anything singleplayer) will make sure to avoid that by slowing the other timer down too if things get too bad, so you don't notice as much.
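That scheme is, in substance, the standard fixed-timestep loop: the simulation advances in constant steps drawn from an accumulator of real elapsed time, and rendering just interpolates between the states that exist. A minimal sketch, not any particular engine's code (update/render are trivial stand-ins):

Code:

#include <chrono>
#include <cstdio>

// Minimal fixed-timestep loop: the simulation advances in constant dt
// steps no matter how fast frames render, so a slow GPU drops rendered
// frames instead of desynchronizing game time.
static double sim_time = 0.0;
void update(double dt)    { sim_time += dt; }             // advance simulation
void render(double alpha) { (void)alpha; /* draw here */ }

int main() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 60.0;      // fixed simulation step (60 Hz)
    double accumulator = 0.0;
    auto previous = clock::now();
    while (sim_time < 1.0) {           // run one simulated second, then stop
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;
        while (accumulator >= dt) {    // catch the simulation up to real time
            update(dt);
            accumulator -= dt;
        }
        render(accumulator / dt);      // interpolation factor in [0,1)
    }
    std::printf("simulated %.2f s\n", sim_time);
}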
To be fair, I meant "single-digitS." At 60FPS, I can generally notice a 4FPS drop or more. At 30, it probably is in the 1-2 range. And 55-60 isn't what I would call "unstable." Honestly, if 15FPS gaming is fine to you, I doubt you could notice a 5FPS drop at 60FPS.
That's the thing. I actually do mind the 60 to 55 drop more than playing at 55 constantly. You notice the change in flow speed, to put it that way.
For some reason, Fable 3 looks markedly better at 4K than most games I run at that resolution. Certain textures are garbage, but for some reason 4K really brings out the normal mapping. I wonder if they threw in something like 2K textures because "Who cares" or in an effort to fight piracy by boosting the download size.
I would not at all be surprised if the 2K textures are actually the same as the 4K ones, just with the extra effects removed. The same diffuse map taken without the bump and specular channels will take up a lot less room in your graphics RAM, but the visual difference will be incredible.
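The VRAM arithmetic behind that guess is easy to sketch: an uncompressed RGBA8 texture costs width x height x 4 bytes, so every extra full-resolution map (normal, specular) multiplies the cost. Illustrative numbers only, assuming uncompressed RGBA8 and ignoring mipmaps and texture compression:

Code:

#include <cstdio>

// Uncompressed texture cost: side * side * 4 bytes per map, in MB.
// Assumes RGBA8 with no mipmaps or compression; illustrative only.
double tex_mb(int side, int maps) {
    return static_cast<double>(side) * side * 4.0 * maps / (1024.0 * 1024.0);
}

int main() {
    std::printf("2K diffuse only:            %6.0f MB\n", tex_mb(2048, 1));
    std::printf("2K diffuse+normal+specular: %6.0f MB\n", tex_mb(2048, 3));
    std::printf("4K diffuse only:            %6.0f MB\n", tex_mb(4096, 1));
    std::printf("4K diffuse+normal+specular: %6.0f MB\n", tex_mb(4096, 3));
}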
No. It makes it easier to detect motion. Or more specifically, motion against the background. Higher FPS means we're also able to track targets better since we're used to seeing things at higher "FPS" anyway. In games where that isn't important, it's nice to be able to see what the fuck is going on. So more FPS is still better.
I generally find that it's something you can easily get used to though. Like I said, I play M&B capped at 15 manually and I don't have any issues tracking movement. Maybe my eyes are just magically better at adapting than everyone else's. But I doubt it.
Left 4 Dead has a tactic called "corner humping" where all the survivors huddle in a corner and spam melee. With this, all the zombies clip into each other and become this flailing mass of... muddy textures and limbs on the Xbox 360. Playing it on PC at 60FPS, I can make out individual faces.
Based solely on your description, I'd say the effect you describe is not what you think it is. Basically what it sounds like to me is that they tied the physics timer to the graphics timer, so that when the FPS drops the rate of physics calculations drops as well. And thus the physics engine is not doing its job of preventing collision. Ordinarily you want to have those two separate.
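One concrete consequence of tying the physics rate to the frame rate: per-step displacement is speed divided by update rate, and once that exceeds an obstacle's thickness a discrete collision test can step straight through it (tunneling). A toy calculation with assumed speeds and rates:

Code:

#include <cstdio>

// Per-step displacement = speed / physics rate. If it exceeds an
// obstacle's thickness, a discrete collision test can miss the contact
// entirely. Speed and rates are assumed, illustrative values.
int main() {
    const double speed = 20.0;                         // object speed in m/s (assumed)
    const double rates[] = {120.0, 60.0, 30.0, 15.0};  // physics updates per second
    for (double hz : rates)
        std::printf("%3.0f Hz physics: %.3f m moved per step\n", hz, speed / hz);
}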
You ever wonder why fast paced fight scenes in movies are a jumbled mess of garbage?
They aren't. Not for me and apparently not for most people on this earth.
Like, can anyone even see what's going on during a fight scene in a Transformers movie?
Yes? I mean, I can see exactly what's going on. It's just that the content genuinely is just a bunch of spinning flailing limbs and pointless crashing into one another.
And movies are still watchable for me because 99% of the movies I watch are at 24FPS. I have no reference to complain otherwise. But start producing movies at nothing but 60FPS, get people used to them (note: The Hobbit caused nausea in some viewers) and going back to 24FPS would be difficult.
That's the thing though. If you can watch a movie at 24 with no ill effects, then you should be able to play a game at equal FPS with no ill effects as well. And vice versa. Anything else means that the issue boils down to "I like it more so I want it to become standard" and not some sort of objective measure.
Go watch a soap-opera for 5 minutes and try to see just how much more detailed everything is in motion.
Will a non-action movie do?
They take the same attitude you do: "I don't notice it, it's not a problem." I do notice it. It affects my enjoyment playing the game. I have the hardware to not deal with it. I don't deal with it.
I see. Well I can certainly understand that attitude and find it completely understandable and acceptable. I was just hoping that there was more to this discussion than just a matter of tastes.
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Alien: Colonial Marines

Post by TheFeniX »

Purple wrote: Say what? :wtf: The only time I have ever seen such an effect on a computer is with MMOs when the server is too full and they lag up. Certainly not in a half-competently programmed computer game otherwise.
Did you try setting Counter-Strike to 15FPS? If you could game at that FPS: kudos. To me, it's virtually unplayable. There's a difference between latency and frame-rate drop, though in MMOs, they can sometimes be very closely related. Though, even if you don't know it: your gaming experience is suffering.
I would not at all be surprised if the 2K textures are actually the same as the 4K ones, just with the extra effects removed. The same diffuse map taken without the bump and specular channels will take up a lot less room in your graphics RAM, but the visual difference will be incredible.
No, 2K and 4K are resolutions. There's nothing to "take out."
I generally find that it's something you can easily get used to though. Like I said, I play M&B capped at 15 manually and I don't have any issues tracking movement. Maybe my eyes are just magically better at adapting than everyone else's. But I doubt it.
Yea, and I "got used" to gaming at 30FPS on console (sans CoD). Doesn't mean I don't vastly prefer 60FPS without motion blur. Mount and blade is not an FPS where you're engaging moving targets, manually, at distances ranging from point-blank to the engine draw distance. That said, at higher FPS, you would probably enjoy the battle animations much more due to being able to actually see them clearly.

Just like standard definition was "perfectly ok" up until HD came out and people realized just how much quality they were missing out on.
Based solely on your description, I'd say the effect you describe is not what you think it is. Basically what it sounds like to me is that they tied the physics timer to the graphics timer, so that when the FPS drops the rate of physics calculations drops as well. And thus the physics engine is not doing its job of preventing collision. Ordinarily you want to have those two separate.
Zombies don't have collision with other zombies in Left 4 Dead. Survivors also only have limited collision with each other. This is purely a product of frame-rate.

You literally just do not understand the difference frame-rate makes in the eye's ability to track moving objects.
They aren't. Not for me and apparently not for most people on this earth.
Not surprising you'd think that actually.
That's the thing though. If you can watch a movie at 24 with no ill effects, then you should be able to play a game at equal FPS with no ill effects as well. And vice versa. Anything else means that the issue boils down to "I like it more so I want it to become standard" and not some sort of objective measure.
Your opinion is bad and you should feel bad. "Most people enjoy shitty reality TV shows, you should be happy with that." 60FPS is objectively better than 30FPS. Like I've already said, that you can't notice the difference doesn't make that any less true.

Besides, 60FPS had always been the PC standard frame-rate up until console hardware couldn't hack it. 30FPS is a choice based solely on "our hardware is shit." It's not done for any other objective reason. There isn't a single cross-plat I've played at 30FPS on console and 60FPS on PC where the PC version didn't give smoother and crisper gameplay without trying (and failing) to rely on motion-blur as a duct-taped solution.
Will a non-action movie do?
Are they shot in 60FPS? It's not even a fucking argument anymore after The Hobbit:
Then it began. The smoothness is readily noticeable. With double the number of frames available, there is less blurring of motion. This is especially noticeable in fast pans, where everything stays in absolute focus. Characters moved with a smoothness you've never seen on the silver screen.

I didn't hate it, at least not so far as the motion went. I suppose that after a while, I could get used to it. However, there was a much larger, and more fatal issue. I couldn't suspend disbelief. Not in the slightest. Not for a second.

Our entire lives we've been conditioned to accept the aesthetics of 24fps "film" as fiction, and higher framerates (like "video") as reality. Think local news versus any movie. "Reality" TV versus scripted primetime TV. There is no technological reason why TV dramas and comedies are shot at 24fps (or are tweaked in post-production to look like they were). Yet, they all are. I think this is why Blair Witch Project captivated so many, it was shot on video, and was therefore "reality." It's also why a movie like Cloverfield didn't grab on that level. Despite pretending to be "found footage," it still looked like film, and therefore "fiction" (that was a missed opportunity, in my book).
Not that it ever was before; people just didn't know any better because examples of high frame-rate video, such as home movies and soap operas, lack the after-effects of production movies and TV shows. You've been conditioned to associate 24FPS and all the associated after-effects with what a movie should look like. So, when you see something different, you scream "fake."

As someone who grew up playing a majority of PC games at 60FPS, 30FPS with motion blur is a piss-poor substitute since Motion Blur is a crutch for 30FPS and actual additive for 60FPS (Portal 2 being a good example). The games will always feel blurry in motion and less-responsive. That's not even an opinion.
I see. Well I can certainly understand that attitude and find it completely understandable and acceptable. I was just hoping that there was more to this discussion than just a matter of tastes.
If you enjoy a sub-par entertainment experience: then yes, it is all about taste.
Purple
Sith Acolyte
Posts: 5233
Joined: 2010-04-20 08:31am
Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.

Re: Alien: Colonial Marines

Post by Purple »

TheFeniX wrote: Did you try setting Counter-Strike to 15FPS? If you could game at that FPS: kudos. To me, it's virtually unplayable. There's a difference between latency and frame-rate drop, though in MMOs, they can sometimes be very closely related. Though, even if you don't know it: your gaming experience is suffering.
The end result is the same though. What you see is out of synchronization with what happens.
No, 2K and 4K are resolutions. There's nothing to "take out."
My point was that you can fake a 4K effect by having 2K textures and then adding on a bunch of effects such as bump maps to them. Which is what I imagine a lot of games do. And that what you see as the 2K version might actually be the same diffuse map used for the 4K, just with the effects that make it pop, stand out and look more detailed than it is removed.
Zombies don't have collision with other zombies in Left 4 Dead. Survivors also only have limited collision with each other. This is purely a product of frame-rate.
Well if they don't have collision, then it's a product of not having collision. If there is neither a physical law nor an AI program preventing them from meshing together, then they will. The only explanation for what you describe, given what you now said, is that there actually is AI code to have them not collide but no law that prevents this, so they walk into one another due to faulty AI code.
You literally just do not understand the difference frame-rate makes in the eye's ability to track moving objects.
No I do not because I do not have issues tracking moving objects in movies (24fps) or games running slower than you like them. My brain is perfectly capable of filling in the gaps and giving me a sense of continuity every time I blink.

Your opinion is bad and you should feel bad. "Most people enjoy shitty reality TV shows, you should be happy with that." 60FPS is objectively better than 30FPS. Like I've already said, that you can't notice the difference doesn't make that any less true.
On the contrary. If I do not see a difference in a factor than from my perspective money, effort and time thrown at improving said factor beyond the limit of my noticing is wasted and could have been used for more important things.
Besides, 60FPS had always been the PC standard frame-rate up until console hardware couldn't hack it. 30FPS is a choice based solely on "our hardware is shit." It's not done for any other objective reason. There isn't a single cross-plat I've played at 30FPS on console and 60FPS on PC where the PC version didn't give smoother and crisper gameplay without trying (and failing) to rely on motion-blur as a duct-taped solution.
I genuinely have not played any console ports that I know of. Well, actually there was this one game, but it's like so old it does not count. Like X-Box 1 old. So I can't comment on this.
Then it began. The smoothness is readily noticeable. With double the number of frames available, there is less blurring of motion. This is especially noticeable in fast pans, where everything stays in absolute focus. Characters moved with a smoothness you've never seen on the silver screen.
To be fair, whatever effect is noticeable will be greatly amplified by the large screen and other movie theater conditions.

I didn't hate it, at least not so far as the motion went. I suppose that after a while, I could get used to it. However, there was a much larger, and more fatal issue. I couldn't suspend disbelief. Not in the slightest. Not for a second.
Not that it ever was before; people just didn't know any better because examples of high frame-rate video, such as home movies and soap operas, lack the after-effects of production movies and TV shows. You've been conditioned to associate 24FPS and all the associated after-effects with what a movie should look like. So, when you see something different, you scream "fake."
??? I am not sure why you are responding to a quote from the article. But like, I don't scream "fake" at the screen. I, as a general rule, know and understand that things on screen look fake. I just don't care. Of course they look fake. It's all special effects and makeup and stuff. It's supposed to look entertaining, not realistic.
As someone who grew up playing a majority of PC games at 60FPS, 30FPS with motion blur is a piss-poor substitute since Motion Blur is a crutch for 30FPS and actual additive for 60FPS (Portal 2 being a good example). The games will always feel blurry in motion and less-responsive. That's not even an opinion.
From my experience motion blur is just a cheap crutch, period. It has the exact same effect regardless of the FPS. My guess is that your personal bias toward 60 simply adds into your experience at 60, making a cheap crutch look less cheap.
If you enjoy a sub-par entertainment experience: then yes, it is all about taste.
I just really do not get hung up about technical specs but about what I actually see. Again, it's the MP3 vs analog sound debate all over again.
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Alien: Colonial Marines

Post by TheFeniX »

Purple wrote: The end result is the same though. What you see is out of synchronization with what happens.
No. With low latency and low frame-rate, you will have up-to-date information on where all assets are on screen, just delivered at longer intervals. With high frame-rate and high latency, you will have models outpacing their hitboxes (making aiming, or, in the case of MMOs, melee attacks, much more difficult) even though the gameplay is smooth. In certain other games, models will "rubber-band" as the server and your client struggle to determine where every asset is actually located and render it correctly. It's all dependent on the netcode.
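Put roughly in numbers: latency controls how stale the positions you see are (about target speed times latency), while frame rate only controls how far a target moves between the frames you get to see. A toy comparison with assumed values:

Code:

#include <cstdio>

// Toy separation of the two error sources:
//  - staleness: distance a moving target drifts during network latency
//  - sampling gap: distance it moves between rendered frames
// Speed and latencies are assumed, illustrative values.
int main() {
    const double speed = 5.0;  // target speed in m/s (assumed)
    const struct { const char* label; double latency_s, fps; } cases[] = {
        {"low latency, low FPS  ", 0.020, 15.0},
        {"high latency, high FPS", 0.150, 60.0},
    };
    for (const auto& c : cases)
        std::printf("%s: positions stale by %.2f m, %.3f m moved per frame\n",
                    c.label, speed * c.latency_s, speed / c.fps);
}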
My point was that you can fake a 4K effect by having 2K textures and then adding on a bunch of effects such as bump maps to them. Which is what I imagine a lot of games do. And that what you see as the 2K version might actually be the same diffuse map used for the 4K, just with the effects that make it pop, stand out and look more detailed than it is removed.
You can't "Fake" an effect like that. A 4k red box is still just a red box and will paint whatever it is skinned to as red. The application of maps is separate from resolutions. If you have a 4k texture and normal map: that mapping is applied no matter what resolution you run the game at.

If anything, the greater resolution just makes it possible for the client to render more subtle variances supplied by the maps.
Well if they don't have collision, then it's a product of not having collision. If there is neither a physical law nor an AI program preventing them from meshing together, then they will. The only explanation for what you describe, given what you now said, is that there actually is AI code to have them not collide but no law that prevents this, so they walk into one another due to faulty AI code.
You aren't understanding my point. The mass of flailing zombie limbs and faces is being rendered the exact same. They are moving the exact same. Everything is the exact same except the framerate. However, at 60 FPS I can make out individual zombies in this massive clipping nightmare. At 30 FPS, you get one or two stand-outs every once in a while. A lot of that has to do with motion blur, but 30FPS without motion blur is a slide-show even most console players notice.
No I do not because I do not have issues tracking moving objects in movies (24fps) or games running slower than you like them. My brain is perfectly capable of filling in the gaps and giving me a sense of continuity every time I blink.
I played Zork. And since my brain can visualize text, Zork has the same graphical fidelity as Crysis. Right?
On the contrary. If I do not see a difference in a factor than from my perspective money, effort and time thrown at improving said factor beyond the limit of my noticing is wasted and could have been used for more important things.
So, then it's ok to say it's just as good, even though you admit it's not and you only feel that way because you can't notice the difference.

Yes, obviously someone with bad eye-sight isn't going to need to waste money on a Blu-Ray player, but most of those people don't go around calling HD a waste or saying "I don't get the point, you should be fine with 480i. I KNOW I AM!"
I just really do not get hung up about technical specs but about what I actually see. Again, it's the MP3 vs analog sound debate all over again.
No, this is like a color-blind person saying 8-bit color should be just fine because it gives him enough shades of gray to "fill in the blanks." And that the rest of us are fools for demanding an expanded color palette when hardware has been capable of displaying it for 20 years.
Grumman
Jedi Council Member
Posts: 2488
Joined: 2011-12-10 09:13am

Re: Alien: Colonial Marines

Post by Grumman »

TheFeniX wrote: Are they shot in 60FPS? It's not even a fucking argument anymore after The Hobbit:
Then it began. The smoothness is readily noticeable. With double the number of frames available, there is less blurring of motion. This is especially noticeable in fast pans, where everything stays in absolute focus. Characters moved with a smoothness you've never seen on the silver screen.

I didn't hate it, at least not so far as the motion went. I suppose that after a while, I could get used to it. However, there was a much larger, and more fatal issue. I couldn't suspend disbelief. Not in the slightest. Not for a second.

Our entire lives we've been conditioned to accept the aesthetics of 24fps "film" as fiction, and higher framerates (like "video") as reality. Think local news versus any movie. "Reality" TV versus scripted primetime TV. There is no technological reason why TV dramas and comedies are shot at 24fps (or are tweaked in post-production to look like they were). Yet, they all are. I think this is why Blair Witch Project captivated so many, it was shot on video, and was therefore "reality." It's also why a movie like Cloverfield didn't grab on that level. Despite pretending to be "found footage," it still looked like film, and therefore "fiction" (that was a missed opportunity, in my book).
Not that it ever was before; people just didn't know any better because examples of high frame-rate video, such as home movies and soap operas, lack the after-effects of production movies and TV shows. You've been conditioned to associate 24FPS and all the associated after-effects with what a movie should look like. So, when you see something different, you scream "fake."
I don't even know that that is true. The Hobbit did look fake, but blaming that on the higher framerate when it has scenes like this in it doesn't give Peter Jackson the blame he deserves for breaking people's suspension of disbelief.
Elheru Aran
Emperor's Hand
Posts: 13073
Joined: 2004-03-04 01:15am
Location: Georgia

Re: Alien: Colonial Marines

Post by Elheru Aran »

Grumman wrote: I don't even know that that is true. The Hobbit did look fake, but blaming that on the higher framerate when it has scenes like this in it doesn't give Peter Jackson the blame he deserves for breaking people's suspension of disbelief.
Without getting into the game debate-- The Hobbit looked fake because it had way too damn much slightly obvious CGI in it, not because of its frame rate. As I've said before, compare it to Lord of the Rings, which looks like a National Geographic special; The Hobbit looks like Metal Gear on the PS or something.
It's a strange world. Let's keep it that way.