Progressive VS Interlaced - Why Ain't Interlaced More Popular?

#1  Edited By Seppli

I've recently come to wonder why developers desperate for frames don't render interlaced instead of progressive - alternating which lines of pixels get rendered each frame. As far as my layman's knowledge goes, that could substantially increase framerates at a relatively minor loss of image quality, or alternatively allow rendering at a higher resolution while maintaining sufficient framerates. Playing on a native 1080p panel, 1080i does look worlds clearer to me than upscaled 720p - almost as good as 1080p.
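To sketch what I'm picturing (a toy illustration from a layman, not how any real renderer works): each frame, render only half the scanlines and reuse the other half from the previous frame's field.

```python
# Toy illustration of interlaced rendering: render even scanlines on even
# frames, odd scanlines on odd frames, and keep the rest from last frame.

def render_line(y, frame_number):
    """Stand-in for an expensive per-scanline render; returns a dummy value."""
    return (y, frame_number)

def interlaced_frame(prev_buffer, frame_number, height):
    """Render only one field's worth of scanlines into a copy of the buffer."""
    field = frame_number % 2  # 0 -> even scanlines, 1 -> odd scanlines
    buffer = list(prev_buffer)
    for y in range(field, height, 2):
        buffer[y] = render_line(y, frame_number)
    return buffer

# Each call renders only height/2 lines, so per-frame cost is roughly halved.
buf = [None] * 6
buf = interlaced_frame(buf, 0, 6)  # fills lines 0, 2, 4
buf = interlaced_frame(buf, 1, 6)  # fills lines 1, 3, 5
```

After two frames every line has been drawn, but any one frame only paid for half of them.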

Would I rather play a game that's rendered at 30 frames per second in 720p? Or at 60 frames per second in 720i? Or at 30 frames per second in 1080i? The latter two options do sound much more enticing to me. On PC, for hardware intensive games like Battlefield 3 and Crysis 2, I'd absolutely love an option to switch to interlaced rendering to potentially double my framerates.

Anybody got a clue why console games don't opt to render an interlaced image, and why it ain't an option in most PC games? And where do you guys stand on the matter? Would you prefer more frames or a higher rendering resolution over the common progressive method of rendering every line of pixels in every frame?

#2  Edited By Enigma777

I'm guessing it's because the TV requires a full frame's info regardless of how it chooses to draw it out, so there's no way to take advantage of it, since it's something on the TV side of things and not the console/PC side. But I don't know jack shit about this sort of stuff, so...

#3  Edited By Justin258

There's probably a good technical reason; either that or "720p" just sounds better because it's more familiar to people.

But yes, if that gives me better performance in my games I would much prefer it.

#4  Edited By kindgineer

No idea, I barely understand that there is a difference. I just want better quality, but it would seem "Progressive" is better known, so it will most likely stay. It still boggles my mind that the PS4 will support 4K... will that be relevant in the next 5-6 years?

#5  Edited By Humanity

@ccampb89 said:

No idea, I barely understand that there is a difference. I just want better quality, but it would seem "Progressive" is better known, so it will most likely stay. It still boggles my mind that the PS4 will support 4K... will that be relevant in the next 5-6 years?

It's actually very relevant. Current-generation Sony digital projectors for movie theatres are 4K and capable of 3D picture as well. The ability to have that fine detail in your home, while a bit overkill for even a 50" TV, is kind of awesome.

#6  Edited By kindgineer

@Humanity said:

@ccampb89 said:

No idea, I barely understand that there is a difference. I just want better quality, but it would seem "Progressive" is better known, so it will most likely stay. It still boggles my mind that the PS4 will support 4K... will that be relevant in the next 5-6 years?

It's actually very relevant. Current-generation Sony digital projectors for movie theatres are 4K and capable of 3D picture as well. The ability to have that fine detail in your home, while a bit overkill for even a 50" TV, is kind of awesome.

Ah, I see. However, will it be relevant enough to be a "Pro" check-box when it releases, or even 3 years down the road? It's fantastic that movie theaters are using it, but if the common household, or at least the upper class, cannot afford to use it in their own houses, it seems like a useless addition. I don't know, I just know that many people are still licking their wounds from purchasing an HD television at $1,500+ - it just seems a little soon to be pushing another graphical generation on such a non-hardcore audience as console media enthusiasts.

#7  Edited By Mnemoidian

@ccampb89: Well, consider Blu-ray - it's a way for Sony to create a secondary saturation of that technology. If they sell a couple of million PS4s with 4K support, then there's an established userbase with access to the technology. When those people upgrade their displays, they will possibly be more inclined to get one with 4K support. I.e., it creates secondary business that they hope to capture with their Bravia division.

@Seppli: There are a bunch of articles on Wikipedia about interlace vs progressive scanning.

http://en.wikipedia.org/wiki/Progressive_scan

http://en.wikipedia.org/wiki/Interlaced_video

But effectively, the main benefit of progressive is that you don't get as many artifacts on images that are moving quickly horizontally.

#8  Edited By Humanity

@ccampb89 said:

@Humanity said:

@ccampb89 said:

No idea, I barely understand that there is a difference. I just want better quality, but it would seem "Progressive" is better known, so it will most likely stay. It still boggles my mind that the PS4 will support 4K... will that be relevant in the next 5-6 years?

It's actually very relevant. Current-generation Sony digital projectors for movie theatres are 4K and capable of 3D picture as well. The ability to have that fine detail in your home, while a bit overkill for even a 50" TV, is kind of awesome.

Ah, I see. However, will it be relevant enough to be a "Pro" check-box when it releases, or even 3 years down the road? It's fantastic that movie theaters are using it, but if the common household, or at least the upper class, cannot afford to use it in their own houses, it seems like a useless addition. I don't know, I just know that many people are still licking their wounds from purchasing an HD television at $1,500+ - it just seems a little soon to be pushing another graphical generation on such a non-hardcore audience as console media enthusiasts.

You're absolutely right, but Sony's business model has always been to push new hardware on people. When the PS3 was coming out, Blu-ray wasn't popular at all. I'd say even today Blu-ray isn't THAT popular, but it's night and day from like 5 years ago. Their big selling points have always been tech that will be important in a few years. The 3D craze didn't catch on as much, so maybe super 4K high-definition TVs will be the new craze - although it will suffer from the same limitations we've been seeing with HD TVs, where your TV might be able to render a beautiful picture but your cable provider or whatnot won't be sending you those signals. I remember the shock everyone went through buying HD TVs only to realize non-HD cable channels looked worse than on their old TVs.

I think anything new and innovative is a pro check-box. If the next 360 has the ability to act as a wireless network hub for my PC I might not care, but it's cool it might do that I guess.

#9  Edited By kindgineer
@Humanity Makes sense, but if I know businesses, it will only cause the console to rise in price. I feel it almost has to be a pure rumor due to the feedback Sony received when they first released information on the PS3. Can Sony even take a hemorrhage of funds like that again?

I just remember when the Xbox 360 was first pushed and everyone was crazed on the idea of HD. When I purchased mine, I still had a 32" tube TV, and aside from a couple of small text mistakes from companies, it wasn't such a big deal back then. Now we are still in the baby stages of HDMI (like you said, Blu-ray still isn't the biggest platform, just the only one), and I feel it would be too early for Sony to push a new graphical "revolution" without alienating customers with its price point.
#10  Edited By Jack268

I prefer progressive. Changing from 480i to 480p on my Wii was a godsend. There was horrible ghosting before on most games.

#11  Edited By sins_of_mosin

Progressive is better - that's why everything is 720p or 1080p and not 720i or 1080i.

#12  Edited By Dogma

If I don't remember totally wrong, you have a higher risk of "ghosting" and delay with interlaced tech, and the risk goes up the faster the picture on the screen moves. We used to say in the store (I have sold TVs) that it's always better to watch rapidly moving pictures on a progressive screen rather than an interlaced one. Since the picture has to be drawn twice before it's complete, there's a higher risk of the picture looking crooked or blurry. We are after all talking about milliseconds, and I sold TVs during 2006-2008, so technology may have changed. Correct me if I'm wrong, but that was the main reason back then why P was preferred over I.
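A toy model of that blurry/crooked effect (my own illustration with made-up numbers, not anything from a real TV): the two fields of one interlaced frame are captured a refresh apart while something moves horizontally, and when they're woven together, adjacent scanlines disagree at the object's edge.

```python
# Hypothetical "combing" demo: the even field sees a moving object at one
# x position, the odd field (captured 1/60 s later) sees it further along.

def field_positions(x_position, height, field):
    """Every scanline of one field sees the moving object at the same x."""
    return {y: x_position for y in range(field, height, 2)}

def weave(even_field, odd_field):
    """Interleave the two fields into one full frame (scanline -> object x)."""
    return {**even_field, **odd_field}

even = field_positions(x_position=10, height=6, field=0)  # captured at time t
odd = field_positions(x_position=14, height=6, field=1)   # captured 1/60 s later

frame = weave(even, odd)
# Adjacent scanlines now show the object's edge 4 pixels apart - the "comb".
comb_offset = abs(frame[0] - frame[1])
```

The faster the motion, the larger that per-line offset, which is why fast pans looked worse on interlaced sets.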

#13  Edited By Scrawnto

@Seppli: That's not how interlaced displays work. They don't actually render two different half frames. The display takes one full frame and shows it one half at a time, splitting it over two refreshes. The only reason for a signal to be interlaced is if your screen is too slow to keep up with the rendering of the console. Interlaced displays are capped at 30 complete frames a second; they just display them as 60 half-frames a second.

There's also the matter of the terrible artifacting that others mentioned. It would be like V-sync issues, multiplied by the vertical resolution of your screen.
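To put the first point another way, here's a tiny sketch (my own illustration, not any real display's logic): the display splits ONE complete frame into two fields shown on successive refreshes, so the source still only delivers 30 full frames per second.

```python
# Sketch of interlaced display output: one complete frame is shown as two
# half-frames (fields) over two consecutive refreshes.

def split_into_fields(frame_lines):
    """Split one complete frame into the two fields shown on successive refreshes."""
    even_field = frame_lines[0::2]  # first refresh: lines 0, 2, 4, ...
    odd_field = frame_lines[1::2]   # second refresh: lines 1, 3, 5, ...
    return even_field, odd_field

frame = ["line0", "line1", "line2", "line3"]
even, odd = split_into_fields(frame)
# 60 field refreshes per second still only deliver 30 complete frames.
```

Nothing here reduces the work of producing the frame; it only changes how the finished frame is scanned out.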

#14  Edited By Vonocourt

I don't think it would really matter, since if you render half the image every 1/60th of a second, it still equals out to rendering the whole frame every 1/30th - with interlacing artifacts included. But then again, I'm not an expert.

Also, 720i doesn't exist - sure, as a theoretical resolution, but no machine or standard accepts it.

#15  Edited By MikkaQ

Because it is hideous and leaves artifacts everywhere - it'd be taking a massive step back in terms of progress. It's a relic of old video systems, a cheat to get a higher frame rate out of the same signal. Now that bandwidth for cable TV is no longer an issue, and consoles don't have that limitation either, why wouldn't you use progressive scan?

Also any benefit you'd get in added frame rate (if any) would be negated by the fact that you'd need a lot more anti-aliasing to make the image look acceptably good.

#16  Edited By hoossy

@ccampb89 said:

No idea, I barely understand that there is a difference. I just want better quality, but it would seem "Progressive" is better known, so it will most likely stay. It still boggles my mind that the PS4 will support 4K... will that be relevant in the next 5-6 years?

Just because the PS4 can support 4K doesn't mean games will be produced that can handle it, especially since console games aren't hitting the 1080p benchmark now. I'd imagine 4K might be utilized for video... that's about it.

#17  Edited By zidd

@Jack268: That's probably because you switched from composite to component cables to get 480p.

Most games don't even run at true 720p nowadays anyway. They need to get to true 1080p in the next generation.