Film plays at 24fps.

General Zod wrote:
> Film already plays at 30fps

They kind of have a point, in that film uses a long exposure period, creating a natural motion blur that preserves smoothness despite the abysmally low frame rate. It makes all the action scenes look blurry as fuck, but that's another issue.
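For concreteness: a film camera's rotary shutter is conventionally open for half of each frame interval (the standard 180° shutter), so at 24fps every frame is roughly a 1/48s exposure. A quick sketch of the arithmetic (the function name is just for illustration):

```python
def exposure_time(fps: float = 24.0, shutter_angle: float = 180.0) -> float:
    """Exposure per frame for a rotary shutter: the shutter is open for
    (shutter_angle / 360) of each 1/fps frame interval."""
    return (shutter_angle / 360.0) / fps

# At 24 fps with a 180-degree shutter, each frame integrates ~21 ms of
# motion; that long exposure is what produces film's natural blur.
print(exposure_time())  # 0.020833... seconds, i.e. 1/48 s
```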
> the amount of time it takes to create the frame is completely irrelevant to the issue of nausea.

CGI movies use temporal supersampling to do more or less perfect motion blur on all moving objects. Games are much, much more limited: they use a few fake motion blur tricks on only the objects that need them most (e.g. bullets, explosions, helicopter blades), or an accumulation buffer to implement horribly laggy inter-frame motion blur. This is essentially why games need much higher frame rates than movies to look good. When it works, this is great; 60+ FPS is an inherently superior solution to even perfectly accurate motion blur, because your eye can track a sharp moving object instead of a smear. When it doesn't work, it sucks: a game running at 24 FPS looks much worse than a movie running at 24 FPS, even before you consider that (unlike movies) in most games you're trying to track moving targets and make precise control inputs based on anticipated positions.
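To make the distinction concrete, here's a minimal sketch of the two approaches (the names and the numpy-image framing are mine, not from any particular engine): temporal supersampling averages many renders taken within a single frame's exposure window, while an accumulation buffer blends frames the player has already seen, which is why its blur lags behind the action.

```python
import numpy as np

def temporal_supersample(render, t, frame_dt, samples=16):
    """CGI-style motion blur: average `samples` renders spread across one
    frame's exposure window. Near-perfect blur, but multiplies render cost."""
    return np.mean(
        [render(t + frame_dt * i / samples) for i in range(samples)],
        axis=0,
    )

def accumulate(accum, new_frame, persistence=0.7):
    """Game-style accumulation buffer: exponentially blend each new frame
    into a running float buffer. Almost free, but the 'blur' is built from
    previously displayed frames, so it trails real motion (the lag)."""
    accum *= persistence
    accum += (1.0 - persistence) * new_frame
    return accum
```

The supersampled version costs 16x the render work per frame, which is fine for an offline render farm and a non-starter for a game trying to hold 60 FPS.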
The 'blurry sea of brown == headache' argument has some merit, essentially because when you're playing on a fixed 2D monitor your brain is forced to fall back on secondary methods of depth estimation (both stereo vision and head-movement parallax being unavailable). Good edge detection and shape recognition are essential for those, so clean, colourful models are indeed easier on the eyes. Theoretically, if we all had 3D monitors with head tracking, it wouldn't be an issue (convenient head tracking is IMHO the only decent use for Natal/Kinect).
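For what it's worth, the head-tracking trick is just an off-axis projection: move the view frustum's apex to wherever the player's head is, and the monitor behaves like a window rather than a flat picture, restoring the parallax cue. A rough sketch of the maths (units, names and the axis convention are my own assumptions):

```python
def off_axis_frustum(head, screen_w, screen_h, near=0.1):
    """Asymmetric view frustum for a head-tracked 'window' display.

    head: (x, y, z) of the viewer's eye relative to the screen centre,
          in metres: x right, y up, z out of the screen (z > 0).
    Returns (left, right, bottom, top) at the near plane, the values
    you'd feed to a glFrustum-style projection."""
    hx, hy, hz = head
    scale = near / hz  # similar triangles: near plane vs screen plane
    left = (-screen_w / 2.0 - hx) * scale
    right = (screen_w / 2.0 - hx) * scale
    bottom = (-screen_h / 2.0 - hy) * scale
    top = (screen_h / 2.0 - hy) * scale
    return left, right, bottom, top

# Example: a 0.5 m wide, 0.3 m tall monitor, head 0.6 m away and a
# little to the right. Recompute every frame from the tracked position.
print(off_axis_frustum((0.1, 0.0, 0.6), 0.5, 0.3))
```

That's the same trick as Johnny Lee's Wii Remote head-tracking demo; recomputed per frame, head movement produces correct parallax on an ordinary 2D screen.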