Watch Dogs Graphics Downgraded on PC

salm
Rabid Monkey
Posts: 10296
Joined: 2002-09-09 08:25pm

Re: Watch Dogs Graphics Downgraded on PC

Post by salm »

bilateralrope wrote:
salm wrote:Features get dropped all the time, by every developer, for every kind of product. Dropping features is an essential part of every development process. Just because some games come with a certain feature doesn't mean that it's not a feature widely dropped.
The problem is that Ubisoft is willing to drop working features. Like the E3 graphics in Watch Dogs. Then spin some obvious bullshit about "possible impacts on visual fidelity, stability, performance and overall gameplay quality" to justify it. Not impacts they had seen, but possible ones.
I'm not arguing that. I'm arguing that Jonathan Cooper's one-or-two-day estimate might not be correct, depending on what quality you expect of a character.

I don't like gaming giants like Ubisoft, and there are plenty of reasons to attack them and their business practices. Cooper's claim is not one of them, or at least there's not enough evidence for us to judge it.
You can complain that they weren't willing to invest more time in a female character, but claiming it would only take one or two days is most likely wrong.
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Watch Dogs Graphics Downgraded on PC

Post by TheFeniX »

salm wrote:All of this is stuff you cannot do in parallel, because every step of the pipeline requires the previous one to be finished, so you cannot simply throw money at it.
This isn't some type of content that relies heavily on other assets, unless even the co-op partner has their own dialog, rather than just being tied to the hosting player's game as a dummy (much like Halo co-op). If the engine is in any way easy to work with, you can literally stick a guy in a back room to hammer all this stuff out, then drop it in when done. 10-to-1 female characters end up being DLC anyways. And they end up being nothing but model and audio swaps. Maybe they'll have a gratuitous walk animation, but that's about it.

Besides, no animations they put in could be worse than what they showed at E3. Good lord, that leg sweep animation looks bad even for the original Mortal Kombat.

I'm interested to know if the face and its animation are part of the game proper, or some kind of high-res/high-poly model they use only for quasi-cutscenes.
Correct me if I'm wrong, but it seems like the last AC game he worked on was AC3, which came out two years ago for last-gen platforms. The new game is for current-gen platforms, and customers' expectations of graphical bling have gone up significantly. The new pipelines might be a lot more complex than the ones he knows from the AC game he worked on.
Maybe expectations have gone up, but nothing about AC: Unity looks next-gen. Even the co-op is watered down compared to a years-old Ubisoft property. That seems to be their current dogma: watered down.
Now, all this information doesn't disprove that the animation (without the rest of the pipeline) could be done in two days, but it does give me a reason not to give this guy's words any weight. It looks an awful lot like he fired some shots without thinking the whole thing through.
The hyperbole isn't the point.
salm
Rabid Monkey
Posts: 10296
Joined: 2002-09-09 08:25pm

Re: Watch Dogs Graphics Downgraded on PC

Post by salm »

Hyperbole. Yeah, right. Lol :lol:
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Watch Dogs Graphics Downgraded on PC

Post by TheFeniX »

So now Ubisoft is downgrading PS4 graphics to keep parity between the Xbone and said PS4 for AssCreed: Unity.
The new current-gen and PC-only Assassin's Creed game, Assassin's Creed Unity, will run at a resolution of 900p and a frame rate of 30 frames per second on both Xbox One and PlayStation 4.

Speaking with VideoGamer, senior producer Vincent Pontbriand revealed the figures, which come in below the 1080p/60fps figures that Ubisoft was reportedly targeting for both platforms. And while there is a technical reason for not being able to reach those numbers, Ubisoft decided to make the two console versions identical in the interest of parity. "We decided to lock them at the same specs to avoid all the debates and stuff," Pontbriand explained.

What holds Unity back is the processors that the consoles are equipped with. They are what's responsible for handling AI--and there is a lot of that to be handled, as Ubisoft has claimed the game can support crowds of 30,000 NPCs.

"Technically we're CPU-bound," Pontbriand said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.

"We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realized it was going to be pretty hard. It's not the number of polygons that affect the frame rate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

We've followed up with Ubisoft to see if we can learn anything more about the situation.

Both the Xbox One and PS4 versions of Assassin's Creed IV: Black Flag ran at 900p initially, but a patch released on launch day brought the PS4 version up to the vaunted resolution of 1080p. It isn't the only game to run at a higher resolution on PS4 than on Xbox One, but more recently, thanks to an update, developers have increasingly reached 1080p with the Xbox One versions of their games.

Unity is one of two original Assassin's Creed games coming on November 11, the other being Assassin's Creed Rogue for Xbox 360 and PlayStation 3. For more on Unity, check out our newly published preview.
This whole thing is funny because the SDF morons who were laughing off the Watch Dogs BS as PC gamers being entitled jerks are now so butthurt they've been blowing up on Twitter, because their gaming platform is getting the shaft. I should be laughing, but I'm more annoyed that we're only a year into the generation and this shit's already getting sad.

I have to actually give Ubisoft some credit if they're focusing on AI at the expense of graphics. But since it's Ubisoft, it's probably all bullshit anyways.
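For what it's worth, the arithmetic behind "CPU-bound" is easy to sketch. A toy frame budget in Python; the per-NPC cost is a number I made up for illustration, not anything Ubisoft has published:

[code]
# Toy model: why 30,000 NPCs can eat a frame's CPU budget on a slow CPU
# even while the GPU sits idle. The per-NPC cost is a made-up assumption.
CPU_BUDGET_MS_60FPS = 1000.0 / 60   # ~16.7 ms per frame at 60 fps
CPU_BUDGET_MS_30FPS = 1000.0 / 30   # ~33.3 ms per frame at 30 fps

npcs = 30_000
cost_per_npc_us = 1.0                      # assume 1 microsecond of AI per NPC
ai_ms = npcs * cost_per_npc_us / 1000.0    # 30 ms of CPU work per frame

print(f"AI update: {ai_ms:.1f} ms/frame")
print(f"Fits at 60 fps: {ai_ms <= CPU_BUDGET_MS_60FPS}")   # False
print(f"Fits at 30 fps: {ai_ms <= CPU_BUDGET_MS_30FPS}")   # True, barely
[/code]

Even at a microsecond per NPC, the crowd alone blows the entire 60 fps budget before rendering, physics, or audio get a slice.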
bilateralrope
Sith Acolyte
Posts: 6168
Joined: 2005-06-25 06:50pm
Location: New Zealand

Re: Watch Dogs Graphics Downgraded on PC

Post by bilateralrope »

"We decided to lock them at the same specs to avoid all the debates and stuff," Pontbriand explained.
I thought the debates between Xbox 1 and PS4 users had settled down to the point where another game coming out with different resolutions wouldn't matter. Then Ubisoft goes and reignites the debate in an attempt to avoid it.

Is there anyone competent in Ubisoft's PR department?
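To put numbers on what that "lock" gives up: a quick sketch of raw pixel throughput, shipped spec versus the reported 1080p/60 target (1600x900 being the usual meaning of "900p"; this ignores AA, post-processing, and everything else):

[code]
# Raw pixel throughput: shipped spec vs. the reported original target.
def pixels_per_second(width, height, fps):
    return width * height * fps

shipped = pixels_per_second(1600, 900, 30)    # 900p at 30 fps
target  = pixels_per_second(1920, 1080, 60)   # 1080p at 60 fps

print(f"shipped: {shipped:,} px/s")           # 43,200,000
print(f"target:  {target:,} px/s")            # 124,416,000
print(f"ratio:   {target / shipped:.2f}x")    # ~2.88x
[/code]

The announced goal was nearly three times the pixels per second of what shipped, which makes "we decided to lock them" sound less like a choice and more like neither box could get there.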
Mr Bean
Lord of Irony
Posts: 22462
Joined: 2002-07-04 08:36am

Re: Watch Dogs Graphics Downgraded on PC

Post by Mr Bean »

I'll say it again: the more I look at the PS4 and Xbone from a pure hardware perspective, the more convinced I am that both sides screwed up the hardware this generation. The CPU choice is going to slam games into a wall a lot faster, because while the GPU jumped forward about five generations, the CPU jumped sideways one generation with that 8-core part running at 1.8 GHz. It's an AMD CPU to boot, based off a competitor to Intel's Atom series (which it lost to). Hook an Nvidia 750 Ti up to an Intel i7 and compare it to another i7 running 500 MHz faster and there is little difference below 1080p... but tie that class of GPU to what the Xbox and PS4 have and we might be seeing a situation where we hit the cap fast. Not because the GPU can't handle more, but because devs can't make eight slow cores perform as fast as four cores going three times the speed.

"A cult is a religion with no political power." -Tom Wolfe
Pardon me for sounding like a dick, but I'm playing the tiniest violin in the world right now-Dalton
DaveJB
Jedi Council Member
Posts: 1917
Joined: 2003-10-06 05:37pm
Location: Leeds, UK

Re: Watch Dogs Graphics Downgraded on PC

Post by DaveJB »

I wonder whether Sony and Microsoft took a similar approach to what Nintendo did with the Wii U, which has a complete joke of a CPU (less than half the speed of the Xbox 360's, by all accounts) and relies on its GPU carrying out computational functions to produce anything more sophisticated than GameCube-level stuff.

Of course, it's probably just as likely that both companies overreacted to the major issues of their previous models, namely the 360's initial heat and unreliability problems, and the PS3's obscenely huge manufacturing costs.
Vendetta
Emperor's Hand
Posts: 10895
Joined: 2002-07-07 04:57pm
Location: Sheffield, UK

Re: Watch Dogs Graphics Downgraded on PC

Post by Vendetta »

It's possible, but I think it's also possible that they eased off from chasing the white whale of "moar power" because it's unsustainable.

Driving up graphical fidelity also drives up development cost and lengthens development time, which is what leads us to those comedy statements like Square Enix being "disappointed" in Tomb Raider "only" shifting six million units.

Costs are getting ludicrous, and that's driving publishers to become more and more conservative, churning out sequels and reusing formulas (especially relevant to this thread, this being a Ubisoft game), because they can only justify the very safest of investments in the triple-A space now.

Even in the PC space, there's actually a lot less drive for "moar power!" from a development stance than you'd think. Crytek aren't bothered any more, Epic's Unreal Engine is all about scalability, Blizzard make games you can run on a potato, and the only people really pushing it are RSI, who can only do it because they crowdfunded it, having found a niche that middle-aged men with disposable income will pay large amounts of money to have filled.

The Xbox One and PS4 are already obsolete hardware, and that doesn't matter: even if they were right on the bleeding edge it wouldn't make the games better, and the higher costs would have a significant chance of making them staler.
DaveJB
Jedi Council Member
Posts: 1917
Joined: 2003-10-06 05:37pm
Location: Leeds, UK

Re: Watch Dogs Graphics Downgraded on PC

Post by DaveJB »

Oh sure, chasing the absolute performance lead with the PS4 and/or Xbone would have been pointless anyway, since PCs would have overtaken them in 12-18 months tops. It's just a little disconcerting hearing that developers are already hitting the CPU limits on the systems after barely a year... but then again Assassin's Creed might be a special case, seeing how it's pulling off crowd scenes and AI routines far larger than 98% of games are ever likely to use. Or Ubisoft may just be incompetent. :P

In any case, I have a feeling that whereas the challenge facing developers last generation (especially with the PS3) was making effective use of multithreaded code, in this one the challenge is going to be balancing out processing work between the CPU and GPU parts of the consoles.
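A sketch of what that rebalancing looks like in miniature: per-entity loops are the shape of work that stays stuck on the CPU, while anything you can phrase as one wide data-parallel operation is the shape you can hand to the GPU. This uses numpy purely as a stand-in for wide parallel hardware, not any real console API:

[code]
import numpy as np

n = 30_000
pos = np.random.rand(n, 2).astype(np.float32)   # crowd positions
vel = np.random.rand(n, 2).astype(np.float32)   # crowd velocities
dt = 1.0 / 30

# CPU style: touch one entity at a time.
pos_serial = pos.copy()
for i in range(n):
    pos_serial[i] += vel[i] * dt

# GPU-compute style: the whole crowd as one data-parallel operation.
pos_batched = pos + vel * dt

assert np.allclose(pos_serial, pos_batched)
[/code]

Movement flattens into the second form trivially; the branchy AI and pathfinding Ubisoft is complaining about mostly doesn't, which is why it piles up on the weak CPU cores.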
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Watch Dogs Graphics Downgraded on PC

Post by TheFeniX »

Vendetta wrote:Driving up graphical fidelity also drives up development cost and lengthens development time, which is what leads us to those comedy statements like Square Enix being "disappointed" in Tomb Raider "only" shifting six million units.
I don't think graphics in and of themselves are really doing anything to drive up development costs. Making a 4K texture vs. a 900p texture doesn't take any more time to develop; many times, developers start at the high end and just compress after the fact. Most modelling is also done with way more polygons than needed, with the poly count reduced after the fact. The problem with graphics is running what looks good without cratering whatever hardware you have.
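That author-high, derive-low workflow really is close to free at the art end. A rough sketch with made-up file names, using Pillow as the stand-in tool:

[code]
from PIL import Image   # Pillow; the file names here are hypothetical

# Author the texture once at the highest resolution, then bake each smaller
# shipping target from that master. The 4K source costs disk space, not
# extra artist time.
src = Image.open("hero_diffuse_4096.png")
for size in (2048, 1024, 512):
    src.resize((size, size), Image.LANCZOS).save(f"hero_diffuse_{size}.png")
[/code]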
Costs are getting ludicrous, and that's driving publishers to become more and more conservative, churning out sequels and reusing formulas (especially relevant to this thread, this being a Ubisoft game), because they can only justify the very safest of investments in the triple-A space now.
But is graphics really what put Tomb Raider in a financial spiral? What was the marketing cost? How much did they spend on voice actors? How much on licensing? General overhead?

Like you, I don't really have many fucks to give about graphics. Passable is OK to me, though since going back to 60FPS gaming, 30FPS can make me nauseous, which is fucking weird because I never had that problem when I was younger. And even then, it wouldn't piss me off so much if every goddamn marketer weren't telling me that the marginal graphics I'm looking at were the best thing ever. And now, due to inflated costs, we can't even get that, because "graphics are expensive."

I get that the Xbone and Sony's new movie console have shit for hardware, but I don't: my computer is going on five years old, with a year-old $300 video card.
Mr Bean
Lord of Irony
Posts: 22462
Joined: 2002-07-04 08:36am

Re: Watch Dogs Graphics Downgraded on PC

Post by Mr Bean »

I think both of you are talking past the point. The problem is not that the Xbone and the PS4 have two-year-old mid-tier GPUs; the problem is that they have a four-year-old laptop-grade CPU which is preventing the GPU from doing its job. Tom's Hardware pegged the GPU as an improved Radeon HD 7870, which should be fine for its purposes. The desktop 7870 can sure as shit turn in 60 FPS on Watch Dogs at 1080p, but the Xbone can't... so the thinking is that the issue is not the GPU, but again that the CPU just can't keep up. An eight-core 1.6 GHz CPU can't feed the GPU fast enough, so things lag the moment you throw in CPU-centric calculations like AI, physics, and pathfinding; either the CPU is failing to keep up or the devs are failing to use all eight cores.

Which is odd, because AMD was about to finish work on Kaveri and had well-known experience with Richland and the like. We've been in a CPU freeze for going on three years now, since Sandy Bridge launched in 2011: if you bought an i5-2500K or an i7-2600K there is still no reason to upgrade, with Ivy Bridge and Haswell not offering enough of a jump, so if you got in on Sandy or Ivy Bridge you're set for another few years. Desktop GPUs are to the point where you can take a Sandy Bridge i5-2500K running at 3 GHz with 8 gigs of RAM, put it up against a Haswell-E with 64 gigs of DDR4 overclocked to 4.5 GHz, and except in 4K you'll see only a few frames' difference between the two, because the CPU is just fine at delivering everything the GPU needs.

The question is how much performance the Xbone and PS4 can wring out with optimizations, because unlike the GPU-limited Xbox 360, where devs got really good at hiding low poly counts with tons of clever tricks... it's not the GPU side that's lacking this go-around.
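Mr Bean's eight-slow-cores-versus-four-fast-cores point is basically Amdahl's law. A toy sketch with made-up parallel fractions; "speed" here is per-core throughput in arbitrary units:

[code]
# Amdahl-style toy: wide-and-slow only competes with narrow-and-fast when
# the workload parallelizes almost perfectly. All numbers are illustrative.
def effective_speed(cores, per_core_speed, parallel_fraction):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial / per_core_speed +
                  parallel_fraction / (per_core_speed * cores))

for p in (0.5, 0.9, 0.99):
    wide_slow   = effective_speed(cores=8, per_core_speed=1.0, parallel_fraction=p)
    narrow_fast = effective_speed(cores=4, per_core_speed=3.0, parallel_fraction=p)
    print(f"parallel={p:.0%}: 8 slow cores -> {wide_slow:.2f}, "
          f"4 fast cores -> {narrow_fast:.2f}")
[/code]

Even at 99% parallel code, eight cores at a third of the clock never catch four fast ones, and game logic is nowhere near 99% parallel.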

"A cult is a religion with no political power." -Tom Wolfe
Pardon me for sounding like a dick, but I'm playing the tiniest violin in the world right now-Dalton
Grumman
Jedi Council Member
Posts: 2488
Joined: 2011-12-10 09:13am

Re: Watch Dogs Graphics Downgraded on PC

Post by Grumman »

"Moar power!" is still useful, even if you're not just spending it on high polygon models and high resolution textures. Increased draw distance and increased maximum density of objects can both improve gameplay and the feel of the world more than just making more detailed individual entities can, without requiring more development time. Sandbox games like Watch_Dogs benefit when high speed chases don't start running up against the game's pop-in distance, and games with aircraft or sniper rifles benefit when your target still exists at engagement range.
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Watch Dogs Graphics Downgraded on PC

Post by TheFeniX »

Mr Bean wrote:I think both of you are talking past the point. The problem is not that the Xbone and the PS4 have two-year-old mid-tier GPUs; the problem is that they have a four-year-old laptop-grade CPU which is preventing the GPU from doing its job.
Yea, this is kind of a tangent we're on. I just keep hearing about skyrocketing development costs, and publishers/marketers keep telling me that super-cool graphics and an extra 30FPS are a large part of it. But it doesn't add up: if anything, it has become cheaper than ever to crank out high-resolution, high-poly graphics. Graphics and 3D modelling programs have come a long way. The problem is getting the assets into the game, not the graphics themselves.

Numbers are always hard to come by, but the most cost-prohibitive parts of development, at least from everything I've read, are marketing, voice actors, animation, and other genuinely time-consuming tasks. Marketing has become an industry in and of itself, sometimes doubling the production cost. Saying it's all due to graphics doesn't add up.
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Watch Dogs Graphics Downgraded on PC

Post by TheFeniX »

Not willing to let the idiocy slide, Ubisoft thinks the industry is going away from 60fps. Just having some French Revolution lolz:
Viva la resolución! Assassin's Creed dev thinks industry is dropping 60 fps standard

Assassin's Creed Unity is the first game in the franchise to be built from the ground up for the new-gen consoles, and Ubisoft has confirmed it will run at 30 frames-per-second, and render at 900p, on both PS4 and Xbox One.

This week, gamers accused Ubisoft of keeping the frame rate and resolution down to these numbers to avoid the PS4 having a graphical advantage, but Nicolas Guérin, World Level Design Director on Unity, told TechRadar that the decision was partly to give the game more of a cinematic gloss, though he did admit that it was also tough to achieve.

"At Ubisoft for a long time we wanted to push 60 fps. I don't think it was a good idea because you don't gain that much from 60 fps and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird.

"And in other games it's the same - like the Rachet and Clank series [where it was dropped]. So I think collectively in the video game industry we're dropping that standard because it's hard to achieve, it's twice as hard as 30fps, and its not really that great in terms of rendering quality of the picture and the image."
What's in a number?

Alex Amancio, the game's Creative Director, reiterated this point: "30 was our goal, it feels more cinematic. 60 is really good for a shooter, action adventure not so much. It actually feels better for people when it's at that 30fps. It also lets us push the limits of everything to the maximum.

"It's like when people start asking about resolution. Is it the number of the quality of the pixels that you want? If the game looks gorgeous, who cares about the number?"

This week, Ubisoft came out to say that it did not lower the specs of Unity to account for one system over another, after Senior Producer Vincent Pontbriand hinted to VideoGamer that this might not have been the case.
There is just nothing about this article that doesn't make me roll my eyes. The idea that 30FPS is superior to 60FPS for anything, much less action-adventure, is mind-boggling. We've hit all the usual points: "programming is haaaaard!", "CINEMATIC EXPERIENCE", and they even managed to bring up the fucking Hobbit.

It's 2014, and we're debating the merits of 30FPS vs 60FPS. This debate honestly did not exist before we entered bizarro world.
bilateralrope
Sith Acolyte
Posts: 6168
Joined: 2005-06-25 06:50pm
Location: New Zealand

Re: Watch Dogs Graphics Downgraded on PC

Post by bilateralrope »

I think a lot of developers are moving to 30 FPS to improve the quality of their screenshots and YouTube videos: marketing wants the screenshots improved, there's no spare hardware capacity left, so framerate gets dropped. Then, when they can't drop the framerate any further, they lower the screen resolution. I wonder how low the resolution can go before it hurts sales.

As for the "cinematic experience", has that ever been used to justify adding a good aspect to video games ?
DaveJB
Jedi Council Member
Posts: 1917
Joined: 2003-10-06 05:37pm
Location: Leeds, UK

Re: Watch Dogs Graphics Downgraded on PC

Post by DaveJB »

TheFeniX wrote:It's 2014, and we're debating the merits of 30FPS vs 60FPS. This debate honestly did not exist before we entered bizarro world.
If they really want a "cinematic experience" they should just lock their game engines at 24FPS. After all, most modern TVs have 24Hz modes for playing Blu-rays, so they may as well go all-out, right?

To be fair, Ubisoft are the only major company I've seen pushing this argument; the only similar thing I've heard of was the PS4 version of The Last of Us, which included an optional frame limiter for people who prefer the look of 30fps. Other than that, it seems like most developers are prioritizing 60FPS over 1080p, hence the resolution disparities between the PS4 and Xbone.
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Watch Dogs Graphics Downgraded on PC

Post by TheFeniX »

DaveJB wrote:
TheFeniX wrote:It's 2014, and we're debating the merits of 30FPS vs 60FPS. This debate honestly did not exist before we entered bizarro world.
If they really want a "cinematic experience" they should just lock their game engines at 24FPS. After all, most modern TVs have 24Hz modes for playing Blu-rays, so they may as well go all-out, right?

To be fair, Ubisoft are the only major company I've seen pushing this argument; the only similar thing I've heard of was the PS4 version of The Last of Us, which included an optional frame limiter for people who prefer the look of 30fps. Other than that, it seems like most developers are prioritizing 60FPS over 1080p, hence the resolution disparities between the PS4 and Xbone.
It's out there; Ubisoft is just the loudest about it.
Now, speaking with Kotaku, Jan said of frame-rates, “60 fps is really responsive and really cool. I enjoy playing games in 60 fps. But one thing that really changes is the aesthetic of the game in 60 fps.

“We’re going for this filmic look, so one thing that we knew immediately was films run at 24 fps. We’re gonna run at 30 because 24 fps does not feel good to play. So there’s one concession in terms of making it aesthetically pleasing, because it just has to feel good to play.”
Basically, looking like a movie jam-packed with motion blur and DoF is more important than responsive gameplay. But it's just hilarious to see someone admit that actually gaming at 24FPS is garbage, yet 30FPS is fine while 60FPS would destroy their "look."
bilateralrope
Sith Acolyte
Posts: 6168
Joined: 2005-06-25 06:50pm
Location: New Zealand

Re: Watch Dogs Graphics Downgraded on PC

Post by bilateralrope »

You forgot to mention that The Order: 1886 also has black bars at the top and bottom of the screen for a 'cinematic' aspect ratio.
salm
Rabid Monkey
Posts: 10296
Joined: 2002-09-09 08:25pm

Re: Watch Dogs Graphics Downgraded on PC

Post by salm »

Since mainstream VR might be around the corner, and VR doesn't work with framerates lower than 60 (75 or even 90 is recommended), this whole problem could be dealt with sooner or later.
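For scale, the frame budgets behind those numbers (and VR compounds the problem by needing a view per eye, though real engines share a lot of work between the two eye renders):

[code]
# Per-frame time budgets at the refresh rates mentioned above.
for hz in (60, 75, 90):
    budget_ms = 1000.0 / hz
    print(f"{hz} Hz -> {budget_ms:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 75 Hz -> 13.3 ms, 90 Hz -> 11.1 ms
[/code]

A renderer built around a 33 ms "cinematic" frame has a third of the time it needs for 90 Hz VR, which is why salm's point has teeth.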
Dread Not
Padawan Learner
Posts: 264
Joined: 2006-06-23 11:41pm

Re: Watch Dogs Graphics Downgraded on PC

Post by Dread Not »

"And in other games it's the same - like the Rachet and Clank series [where it was dropped]. So I think collectively in the video game industry we're dropping that standard because it's hard to achieve, it's twice as hard as 30fps, and its not really that great in terms of rendering quality of the picture and the image."
What's in a number?
Bahahaha, what horseshit. I've been playing the most recent :roll: "Rachet" :roll: & Clank, and the slower framerate hurts the experience baaaaad. I'm still enjoying myself, but with all the destruction and chaos that happens on screen in a Ratchet game, 30 fps significantly diminishes the pleasant feedback of blasting swarms of enemies and watching the bolts fly out of them and get sucked up toward you. The game also came with a code for R&C: Quest for Booty, which I'd never bothered with; it runs at 60 fps, and the difference going from one game to the other almost made my eyes pop out of my head.

And I have never played a PC game at 60 fps and thought "This framerate is really killing the cinematic experience of this game for me. I think I'll lock it to 30." I'm highly skeptical anyone else ever has either.
Lord Revan
Emperor's Hand
Posts: 12236
Joined: 2004-05-20 02:23pm
Location: Zone:classified

Re: Watch Dogs Graphics Downgraded on PC

Post by Lord Revan »

Yeah, especially as 30 fps in a video game doesn't look the same as 24 fps on movie film. Not even close. More often than not, "cinematic experience" means "we couldn't be arsed to make a good game, so here are some graphics that look good in static screenshots to hide that fact." I mean, WoW, despite being based on a roughly 15-year-old engine (a modified version of the one they developed for SC: Ghost), often plays better than a lot of so-called "cinematic experience" games, simply because instead of chasing the best possible graphics the hardware of the day could handle, Blizzard used graphics that worked to give you the best gameplay experience possible.
salm
Rabid Monkey
Posts: 10296
Joined: 2002-09-09 08:25pm

Re: Watch Dogs Graphics Downgraded on PC

Post by salm »

Lord Revan wrote:Yeah, especially as 30 fps in a video game doesn't look the same as 24 fps on movie film. Not even close. More often than not, "cinematic experience" means "we couldn't be arsed to make a good game, so here are some graphics that look good in static screenshots to hide that fact." I mean, WoW, despite being based on a roughly 15-year-old engine (a modified version of the one they developed for SC: Ghost), often plays better than a lot of so-called "cinematic experience" games, simply because instead of chasing the best possible graphics the hardware of the day could handle, Blizzard used graphics that worked to give you the best gameplay experience possible.
A 24 fps movie naturally has motion blur, whereas a game does not unless you add it artificially with a post effect. That's the reason movies look good at 24 fps.
If they really want to go for a "cinematic experience" they really do have to lower the frame rate and apply good motion blur (there are plenty of examples of bad motion blur, by the way).

Now, whether you like "cinematic experiences" in games is a different question, but if they are honestly going for that look and find it more important than the things that would favour 60 or more fps, then they're doing the logically consistent thing.
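The mechanism salm describes is just a camera shutter staying open for part of each frame. A minimal sketch of faking that by accumulating sub-frame renders; render() is a dummy stand-in here, and real engines use cheaper velocity-buffer approximations instead of brute-force sampling in time:

[code]
import numpy as np

def render(t):
    """Hypothetical renderer: the frame at time t as an HxWx3 float array."""
    frame = np.zeros((90, 160, 3), dtype=np.float32)
    x = int((t * 300) % 160)      # a bright column moving at 300 px/s
    frame[:, x] = 1.0
    return frame

def motion_blurred(t, fps=24, shutter=0.5, samples=8):
    # shutter=0.5 is the film-style 180-degree shutter: open for half the
    # frame interval. Average the renders across the open-shutter window.
    open_time = shutter / fps
    times = t + np.linspace(0.0, open_time, samples)
    return np.mean([render(ti) for ti in times], axis=0)

frame = motion_blurred(t=1.0)     # the moving column comes out as a smear
[/code]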
User avatar
Vendetta
Emperor's Hand
Posts: 10895
Joined: 2002-07-07 04:57pm
Location: Sheffield, UK

Re: Watch Dogs Graphics Downgraded on PC

Post by Vendetta »

Dread Not wrote: And I have never played a PC game at 60 fps and thought "This framerate is really killing the cinematic experience of this game for me. I think I'll lock it to 30." I'm highly skeptical anyone else ever has either.
Depends how distracting you find screen tearing.

Some people might prefer a stable frame rate that syncs with their monitor's refresh and eliminates tearing over a variable one that generates it.

Frame rate affects different games differently as well: the more precisely your input has to match what's on screen (competitive FPS, fighting games including spectacle fighters, racing games, rhythm games), the more important a high frame rate is. For third-person games with much looser combat timings it matters less: Dark Souls works at 30FPS because it's slow and relies on anticipation more than reaction, and many people cope with low FPS in World of Tanks for the same reason.

I would say that higher framerates are also more important on PC, for the simple reason that you usually sit closer to the display, so the effects of frame rate on animation are more noticeable than they would be on a screen across the room. (It's also the reason first-person games on PC tend to need a wider FoV: the screen is physically closer.)

So y'know, 30FPS in a console third-person action game isn't going to have the same effect on the experience as it would in a PC FPS. Sure, 60FPS would be better for both, but dropping to 30 hurts less in some situations (and still doesn't push input lag past the point where most people get distracted by it, though some still might).
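A toy way to see that genre split, with a made-up reaction window:

[code]
# A fixed timing window contains fewer frames (and coarser input sampling)
# at 30 fps than at 60 fps. The 100 ms window is an illustrative number.
window_ms = 100.0

for fps in (30, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: {frame_ms:.1f} ms/frame, "
          f"{window_ms / frame_ms:.0f} frames inside a {window_ms:.0f} ms window")
# 30 fps: 3 frames in the window; 60 fps: 6 frames.
[/code]

Three frames versus six is the difference between a parry feeling tight and feeling like a coin flip, which is why the precision-input genres Vendetta lists care so much more.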

(Also: The Order: 1886 is just Baby's First Bloodborne....)
TheFeniX
Sith Marauder
Posts: 4869
Joined: 2003-06-26 04:24pm
Location: Texas

Re: Watch Dogs Graphics Downgraded on PC

Post by TheFeniX »

salm wrote:Since mainstream VR might be around the corner, and VR doesn't work with framerates lower than 60 (75 or even 90 is recommended), this whole problem could be dealt with sooner or later.
I doubt VR is going to achieve mainstream status. Switching from a TV/monitor, which you need for pretty much every other task your devices do, over to single-user goggles isn't really going to cut it.

Anyways, the sad part is that motion blur merely covers up the jankiness of 30FPS, but it's actually pretty cool when done right in a 60FPS game. I'm pretty sure Crysis had motion blur; I know Portal did. In a 30FPS game it can be done well enough that you barely notice the framerate (Gears of War) or poorly enough that the game still plays clunky (The Force Unleashed). The Force Unleashed had the extra problem of wanting to go slow-mo every five seconds, which really highlighted the framerate as boxes and NPCs bounced around the room.

Alternatively, the slow-mo portions of Jedi Outcast and Jedi Academy didn't have this problem, for obvious reasons. I had so much experience with lightsabers at 60FPS (and even 75FPS during one of my spells with an old LCD) that going from that to the 30FPS QTE fest of TFU made me think the game was suffering performance problems.
salm
Rabid Monkey
Posts: 10296
Joined: 2002-09-09 08:25pm

Re: Watch Dogs Graphics Downgraded on PC

Post by salm »

TheFeniX wrote:I doubt VR is going to achieve mainstream status. Switching from a TV/monitor, which you need for pretty much every other task your devices do, over to single-user goggles isn't really going to cut it.
I disagree. I think these problems are solvable. People seem to love VR (at least once they've tried it), even non-gamers, so there will be quite a market, and there's already a lot of money in the business. Furthermore, it's not just a graphical gimmick; it actually adds plenty of gameplay value.

Anyways, the sad part is that motion blur merely covers up the jankiness of 30FPS, but it's actually pretty cool when done right in a 60FPS game. I'm pretty sure Crysis had motion blur; I know Portal did. In a 30FPS game it can be done well enough that you barely notice the framerate (Gears of War) or poorly enough that the game still plays clunky (The Force Unleashed). The Force Unleashed had the extra problem of wanting to go slow-mo every five seconds, which really highlighted the framerate as boxes and NPCs bounced around the room.

Alternatively, the slow-mo portions of Jedi Outcast and Jedi Academy didn't have this problem, for obvious reasons. I had so much experience with lightsabers at 60FPS (and even 75FPS during one of my spells with an old LCD) that going from that to the 30FPS QTE fest of TFU made me think the game was suffering performance problems.
Oh, sure, 30 FPS + MB will only cover up the stuttering, just like in a movie; it does nothing for latency. Some games, as Vendetta pointed out, don't really need low latency, though, so 30 FPS + MB might be a decent solution there.

Personally, I prefer higher frame rates to some "cinematic experience" bullshit, though.