bilateralrope wrote:Prove the "merely do damn well on PC" part. Specifically, I want you to compare PC, PS4 and XBone sales numbers for a game with a decent PC port. I remember the ports for Mad Max and Shadows of Mordor being well regarded, so I can guarantee I won't argue the "decent PC port" point if you use them. But there are plenty of other games you could use instead.
Bethesda had a blog post a few years back talking about how the majority of pre-order money came from consoles (like 80%+). Activision posted something similar about MW3 (or maybe it was 2). Those posts are gone now, and tracking pre-order and day-1 sales for digital distribution is kind of a "had to be there" deal. But they ended up making tons of money off PC in the long run thanks to sales, etc. However, the current emphasis on the "AAA blockbuster" living or dying on pre-orders means they consider those long-tail figures worthless. They want their money and they want it now.
I plan to keep an eye on how the Fallout 4 release goes, as it's going to be a big one, but at this point in time I'll concede you the argument.
Purple wrote:A headache? Er... I have never had that sort of problem. Like as long as the frame rate is stable I genuinely don't really care if a game is doing 15 or 50.
If you forced me to play at a "stable 15FPS," I'd rather not play the game. The amount of eye strain alone that creates for me is incredible. You call it some kind of sickness, but to me it's no different than if I walked outside and there was a persistent "strobe light" effect on all movement: cars not driving down the street, but teleporting a few feet at a time.
That's the thing. We are talking under the assumption of a stable frame rate. As I mentioned before, I too can notice changes. Maybe not +/- 1 like you can, but I do notice them. And I find that in any game, stability counts for a lot more than the actual rate. For me a stable 15 looks better than an unstable 55-60.
To be fair, I meant "single-digitS." At 60FPS, I can generally notice a drop of 4FPS or more. At 30, it's probably in the 1-2 range. And 55-60 isn't what I would call "unstable." Honestly, if 15FPS gaming is fine to you, I doubt you could notice a 5FPS drop at 60FPS.
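Back-of-the-envelope math, assuming a perfectly steady frame rate (the specific numbers here are mine, not anything measured): frame time is just 1000/FPS in milliseconds, so the same "small" drop costs more time per frame the lower you start.

# Frame-time arithmetic for a steady frame rate: how much longer each frame
# sits on screen when the rate drops by a few FPS. Illustrative numbers only.
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for base, drop in [(60, 5), (30, 2)]:
    before = frame_time_ms(base)
    after = frame_time_ms(base - drop)
    print(f"{base} -> {base - drop} FPS: {before:.1f} ms -> {after:.1f} ms "
          f"(+{after - before:.1f} ms per frame)")

Roughly, 60 to 55 adds about 1.5 ms per frame while 30 to 28 adds about 2.4 ms, which lines up with small drops being easier to feel at lower rates.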
It depends on the source though. Like most of the stuff I've seen looks almost exactly the same in both. You really have to optimize your graphics to get a difference worth noticing. It's down to the textures more than anything.
Just cranking the resolution of Final Fantasy X up to 1080p or greater on my PS2 emulator makes a world of difference in texture quality. It all depends on the game. Some older games already had textures that were "hi-res" by the standards of the time, ready to display, but the native output resolution just made them look like muddy garbage. That's kind of the thing: I'm sure the textures in FFX were in the 1024x768 range at least, but the average TV at the time topped out at 480i.
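For a rough sense of scale (standard nominal resolutions, my own numbers rather than anything pulled from the game's assets), a quick pixel-count comparison shows how little of that detail an SD set could actually resolve:

# Pixel-count comparison: a standard-definition TV picture vs. common emulator
# render targets. Resolutions are standard nominal values, used only for scale.
resolutions = {
    "SDTV (480i, ~640x480)": (640, 480),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}

sd_pixels = 640 * 480
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels (~{pixels / sd_pixels:.1f}x an SD picture)")

1080p is roughly 6.75x the pixels of an SD picture and 4K is about 27x, so a modern render target finally has room to show detail the original assets were already carrying.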
For some reason, Fable 3 looks markedly better at 4K than most games I run at that resolution. Certain textures are garbage, but 4K really brings out the normal mapping. I wonder if they threw in something like 2K textures because "who cares," or in an effort to fight piracy by boosting the download size.
Thing is, I only see it as an issue if two players have different frame rates because this does give an advantage to one. If both have the same FPS you really could be playing at 15 as far as I am concerned.
No. It makes it easier to detect motion. Or more specifically, motion against the background. Higher FPS also means we're able to track targets better, since we're used to seeing things in higher "FPS" anyway. In games where that isn't important, it's nice to be able to see what the fuck is going on. So more FPS is still better.
A jumbled mess of individual frames? What are you talking about? Like, can you even watch movies then? IIRC all cinema movies and cartoons are done at 24 FPS. According to you, that should be unwatchable.
Left 4 Dead has a tactic called "corner humping" where all the survivors huddle in a corner and spam melee. With this, all the zombies clip into each other and become a flailing mass of... muddy textures and limbs on the Xbox 360. Playing it on PC at 60FPS, I can make out individual faces.
You ever wonder why fast-paced fight scenes in movies are a jumbled mess of garbage? Like, can anyone even see what's going on during a fight scene in a Transformers movie? And movies are still watchable for me because 99% of the movies I watch are at 24FPS; I have no reference to complain otherwise. But start producing movies at nothing but 60FPS, get people used to them (note: The Hobbit's 48FPS version caused nausea in some viewers), and going back to 24FPS would be difficult.
Go watch a soap opera for 5 minutes and see just how much more detailed everything is in motion.
It really isn't a huge issue though. Certainly not as huge as you make it out to be. Like you make it sound as if the wrong option is going to introduce 1-2 seconds of lag in your mouse and not a few measly milliseconds. I mean sure, I can see how that sort of thing might be important if you are Korean and want to hone your 2000-clicks-per-second skill for Starcraft, or if you are playing an FPS professionally in a tournament, but I fail to see how that is relevant to the 90% of the gaming population who don't.
They take the same attitude you do: "I don't notice it, so it's not a problem." I do notice it. It affects my enjoyment of the game. I have the hardware to not deal with it. I don't deal with it.
There is no game I could play where 30FPS would be more enjoyable than 60FPS. That's why it's a big deal.