I've recently come to wonder why developers desperate for frames don't render interlaced instead of progressive - alternating which lines of pixels get rendered each frame. As far as my layman's knowledge goes, that could substantially increase framerates at a relatively minor loss of image quality, or alternatively allow rendering at a higher resolution while maintaining sufficient framerates. Playing on a native 1080p panel, 1080i does look worlds clearer to me than upscaled 720p - almost as good as 1080p.
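To illustrate the idea, here's a minimal sketch of field-based rendering. Everything in it is hypothetical - `render_scene` stands in for whatever per-pixel work a real renderer does - but it shows the core trick: shade only half the scanlines each frame, then weave the two most recent fields into one full image.

```python
def render_scene(row, col, frame):
    # Hypothetical stand-in for the expensive per-pixel shading work.
    return (row * 31 + col * 7 + frame) % 256

def render_field(width, height, frame):
    """Render only every other scanline: even rows on even frames,
    odd rows on odd frames - roughly half the shading cost per frame."""
    parity = frame % 2
    return {row: [render_scene(row, col, frame) for col in range(width)]
            for row in range(parity, height, 2)}

def weave(prev_field, cur_field, height):
    """Combine the two most recent fields into one full-height frame."""
    rows = {}
    rows.update(prev_field)
    rows.update(cur_field)
    return [rows[row] for row in range(height)]

f0 = render_field(8, 8, 0)   # even scanlines only
f1 = render_field(8, 8, 1)   # odd scanlines only
full = weave(f0, f1, 8)      # full 8x8 frame built from two half-cost fields
```

Each field here costs half the shading of a full frame, which is where the hoped-for framerate gain would come from - at the price of the two woven fields showing the scene at two different moments in time whenever anything moves (combing artifacts).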
Would I rather play a game that's rendered at 30 frames per second in 720p? Or at 60 frames per second in 720i? Or at 30 frames per second in 1080i? The latter two options do sound much more enticing to me. On PC, for hardware intensive games like Battlefield 3 and Crysis 2, I'd absolutely love an option to switch to interlaced rendering to potentially double my framerates.
Anybody got a clue why console games don't opt to render an interlaced image, and why it ain't an option in most PC games? And where do you guys stand on the matter? Would you prefer more frames or a higher rendering resolution over the common progressive method of rendering every line of pixels in every frame?