4K: OLED TV or IPS LED Monitor?

Giant_Gamer

Poll 4K: OLED TV or IPS LED Monitor? (88 votes)

OLED TV 41%
IPS Monitor 15%
Don't waste your money on one of those. Go with the future and go VR! 0%
Results 44%

I won't lie.

Since Jeff bought an OLED 4K TV, I started pondering the idea of going 4K myself, and to my surprise I found 40-inch 4K IPS monitors on the market!

Back then I had to buy a slow-ass Smart TV because the IPS monitors were too small for my living room. The biggest one was 30 inches and was the first 4K monitor made by ASUS. However, now I have the option to grant my past self's wish.

The only thing stopping me, though, is the existence of OLED TVs.

Are they better in image quality? Do they match IPS monitors in response time? Are they on the same level in performance?

dafdiego777

Which monitor are you looking at?


#2  Edited By leftie68

Really depends. Is this going to be used as a console display, or a PC display? If it is for a PC, there really is no question: you should go for a very good IPS monitor, for a couple of reasons. 1) Response time is still noticeably better on a monitor, and 2) refresh rate is, for most gamers, much more important than resolution. If your PC can push the FPS to make use of a higher refresh rate, a high-refresh monitor is the way to go.

Of course all those benefits go out the window with most consoles, because of the limitation of the hardware.
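To put that trade-off in numbers, here is a quick sketch (Python; nothing assumed beyond the standard resolutions and refresh rates):

```python
# Frame-time budget per refresh rate, and pixel counts per resolution:
# higher refresh shrinks the time available to render each frame, while
# 4K quadruples the pixels the GPU must fill relative to 1080p.
resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
base = resolutions["1080p"]

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")

for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.2f}x the pixels of 1080p")
```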

Zurv

OLED :) It is truly amazing! Having true blacks is stunning.

My suggestion is to get last year's "low end" model. Dan got his from eBay for under $1,500, which is pretty amazing. The B6 and most of the 2016 models are about 30-35ms in game mode. (There is an HDR game mode too, which is a must. Jeff failed and didn't update his firmware for HDR game mode. Bad Jeff... bad! :) ) 4K and HDR freakin' rock.

The 2017 LG OLEDs use the same screens as the 2016 TVs (all OLED TVs use LG panels, btw), but the latency is lower on the new models (20-25ms). For the cost I'd stay with the 2016 units. (The E6 has 3D if that is your thing. It isn't for me, but passive 3D does look amazing on it. LG isn't adding 3D to its TVs anymore.)

I love these TVs. I've been upgrading every year for the past 4 years (ugh), from LG OLED to LG OLED. Right now I'm rocking a 2016 65" G6 and a 55" C7.

Of course, this is all a trap! Once you get the screen you'll want a PC with video cards able to pump out "real" 4K.

RE7 looks amazing, and crazy-face Mass Effect does too in 4K/HDR.

Worried about HDMI 2.1? Yeah, it looks pretty cool, but nothing is coming out this year with it, so don't stress about it.

Giant_Gamer

@dafdiego777: I don't have a specific monitor in mind, but as soon as I saw one like the monitor below, I thought: that's it, time to ask the duders.

https://www.amazon.com/gp/aw/d/B01E18XRY2/ref=mp_s_a_1_9?ie=UTF8&qid=1490200030&sr=8-9&pi=SL75_QL70&keywords=ips+monitor+4k

WynnDuffy

I would not buy a non-OLED TV in 2017 if you can afford one.

IPS is a bigger deal for a PC monitor, where colour accuracy can matter more, and so can viewing angles.

Zurv

@giant_gamer: I'd check the input lag on that IPS thing ;) There are very few large "monitors" that are good for gaming. Also note that monitors don't have very good scalers in them, so unless you are feeding them their native resolution, stuff isn't going to look as good.
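A quick way to see the native-resolution point is whether the source divides evenly into the panel's pixel grid. A minimal sketch (Python; the "sharp at integer ratios" part assumes simple pixel duplication, which not every scaler actually does):

```python
# On a 4K panel, 1080p fits at an exact 2x scale (each source pixel becomes
# a clean 2x2 block), but 1440p needs a fractional 1.5x scale, forcing the
# scaler to interpolate and soften the image.
PANEL_WIDTH = 3840  # 4K panel

for src_w, src_h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    scale = PANEL_WIDTH / src_w
    kind = "integer, can stay sharp" if scale == int(scale) else "fractional, interpolated/soft"
    print(f"{src_w}x{src_h} -> {scale}x scale ({kind})")
```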

I'd look at something more like this: https://www.amazon.com/Acer-Predator-XB321HK-bmiphz-Widescreen/dp/B01A3N60A2/ref=sr_1_1?ie=UTF8&qid=1490200917&sr=8-1&keywords=acer+4k+32%22+gsync

That one is 32" and has G-Sync.

That said... OLED is freakin' amazing. An example: Dishonored 2 is a really pretty game, but going from my IPS monitor to the OLED TV... wow! With the contrast of true black and no bleeding light, that game went from good-looking to amazing. OLEDs control light output per pixel; backlit screens, even ones with zones, light up all or part of the screen. Light bleeding sucks. :(

Giant_Gamer

@wynnduffy: yeah this!

And the monitor's image is sharper even though it uses the same resolution as the TV.

Don't you feel like color accuracy matters more than OLED's blacks?


#8  Edited By Zurv

@giant_gamer said:

@wynnduffy: yeah this!

And the monitor's image is sharper even though it uses the same resolution as the TV.

Don't you feel like color accuracy matters more than OLED's blacks?

Why do you think that monitor is better for color accuracy than the TV? I'm guessing you are not paying someone hundreds of dollars to calibrate whatever you get. Also, if you really cared about that, you wouldn't be getting that IPS monitor.

Also, don't forget HDR... so yummy :)

(Note that the OLEDs can do 12-bit color too; in fact 10- or 12-bit is required for HDR. Also, great side viewing angles, and 99% coverage of the sRGB/Rec.709 color gamut; most LCDs are like 65%. I couldn't find any info on the monitor you linked.)

Maybe look at this for a monitor option: it has HDR and 95% DCI-P3 color gamut coverage. Sadly no G-Sync, though:

http://www.theverge.com/circuitbreaker/2017/3/17/14961720/lg-32ud99-price-release-date

(Really, big monitors aren't very good for gaming... or most other stuff, other than being big.)

Eurobum

I don't know anything about current OLEDs other than that they use more power and burn out rather quickly (lose their brightness and shift color), but seeing how VR uses OLED panels, this is surely the way forward. However, I would seek out the answer to why there are no OLED PC monitors before I buy anything.

dafdiego777

@eurobum: LG basically owns the patent on real OLED tech, and they've been pumping out as much as possible for their TV lineup.


#11  Edited By WynnDuffy

@zurv said:

@giant_gamer: light bleeding sucks. :(

It does, but it's not always a problem. I have an Acer XB271HU and I got very lucky in that my screen has practically no light bleed. The tiny amount of IPS glow it has is imperceptible unless I'm in a completely dark room.

The black levels are still bad compared to my plasma, but the monitor is otherwise terrific. I got mine calibrated too; it's very accurate, and if it weren't for my photo work I would have just left it at the default settings. It probably has the best black levels I've ever seen on an IPS; horror games actually look good instead of crappy and grey.

Will be buying an OLED later this year, biding my time a little because I have to buy a bunch of stuff for my new apartment soon.


#12  Edited By Zurv

@wynnduffy said:

It does, but it's not always a problem. I have an Acer XB271HU and I got very lucky in that my screen has practically no light bleed. [...]

The Acers are nice. I have the XB321HK and like it a lot. But when I talk about light bleed I don't mean the normal uniformity or edge bleed; because an LCD's light source is behind the panel and blasting all over, a black pixel is still giving off light. On an OLED it is like the screen is off for those pixels. If a black screen comes up in a game I couldn't even tell the OLED was on, whereas it's pretty clear on the LCD.

Honestly, the only thing that really bugs me about the OLEDs is the limits caused by HDMI 2.0. HDR requires 10- or 12-bit color depth, but at 10 or 12 bits you can't output full (4:4:4) chroma. So for an HDR game you have to set 12-bit 4:2:2, while in non-HDR games (to look their best) you'd want 8-bit 4:4:4. This will be fixed with HDMI 2.1 (which won't be in anything until next year) or DisplayPort 1.3/1.4.

Of course, very few monitors these days can do HDR, and sadly none are OLED. It is a pain to keep switching settings between games. If anyone was wondering how the consoles (One S and PS4 Pro) handle this: they do the switching for you. Playing an HDR game? Then 4:2:2 12-bit. For a non-HDR game, 4:4:4 8-bit.
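To put numbers on that HDMI 2.0 bottleneck, here is a rough sketch (Python). It uses the standard CTA-861 4K60 timing, and the bits-per-pixel figures are a simplification of how HDMI actually packs 4:2:2, but the verdicts come out the same as above:

```python
# Back-of-the-envelope for the HDMI 2.0 limit described above. The standard
# 4K60 timing transmits 4400 x 2250 clocks per frame once blanking intervals
# are included, i.e. a 594 MHz pixel clock. HDMI 2.0 carries 18 Gbit/s raw,
# or roughly 14.4 Gbit/s of payload after 8b/10b encoding overhead.
PIXEL_CLOCK = 4400 * 2250 * 60        # 594 MHz, blanking included
HDMI20_GBPS = 14.4                    # effective payload rate

BITS_PER_PIXEL = {
    ("8-bit", "4:4:4"): 8 * 3,        # three full-resolution channels
    ("10-bit", "4:4:4"): 10 * 3,
    ("12-bit", "4:2:2"): 12 * 2,      # chroma halved horizontally
}

for (depth, chroma), bpp in BITS_PER_PIXEL.items():
    gbps = PIXEL_CLOCK * bpp / 1e9
    verdict = "fits" if gbps <= HDMI20_GBPS else "does NOT fit"
    print(f"{depth} {chroma}: {gbps:.2f} Gbit/s -> {verdict}")
```

Running it: 8-bit 4:4:4 and 12-bit 4:2:2 both come in at about 14.3 Gbit/s (just under the wire), while 10-bit 4:4:4 needs about 17.8 Gbit/s, which is exactly why you have to pick between full chroma and HDR bit depths until HDMI 2.1.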

I always thought this was a useful pic (for those that don't want to use "Full"). I look forward to the day when we can do full and 10/12-bit at the same time.


*Note: for some reason the darn NVIDIA drivers always default to limited range (16-235 instead of the full 0-255) when connected to the LG. So when you update drivers, make sure you are set to RGB, 8-bit, Full. :)


#13  Edited By Giant_Gamer

@zurv: First off, if I were upgrading TVs every year I wouldn't go "ugh," I'd go "YAY!" :D

Actually, one of the reasons I am thinking of going with an IPS monitor is the input lag. Most TVs have input lag of around 0.5 seconds, which is a lot to take if I am playing a fast-paced game or even using my desktop on them.

The Philips monitor I linked in my reply has a response time of 5ms, but isn't response time the same thing as input lag? Because if it is, then that's f'ing incredible!

Monitors, as far as I'm aware, usually come with manual scaling options, so if the picture from a certain device isn't scaled well I can just scale it manually. Or are you saying that they don't scale well in the absence of post-processing?

Thank you for being informative, because silly me, I thought IPS monitors were 16-bit, making them HDR-ready. :p

I am now seriously thinking about waiting for the HDMI 2.1 OLED TVs, because I would like to see HDR displayed at its full potential, not to mention the faint hope that we get TVs with lower input lag next year.

Eurobum

@zurv: Seriously, why should color accuracy matter for games of all things? I'm pretty sure the latest GPUs use (or started using) chroma subsampling (4:2:?) to cut down texture sizes and free up memory bandwidth, which is basically the biggest bottleneck in high-resolution rendering.

Wikipedia: "To calculate the required bandwidth factor relative to 4:4:4 (or 4:4:4:4), one needs to sum all the factors and divide the result by 12 (or 16, if alpha is present)."

So let's compare: (4+4+4)/12 = 1 versus (4+2+0)/12 = 0.5. We can halve the bandwidth without a noticeable difference. Done.
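That Wikipedia formula is a one-liner; a tiny sketch (Python) to see the usual schemes side by side:

```python
# Bandwidth factor of a J:a:b chroma subsampling scheme relative to 4:4:4:
# sum the three factors and divide by 12 (divide by 16 if an alpha plane
# is carried as a fourth full-rate component).
def bandwidth_factor(j, a, b, alpha=None):
    total = j + a + b + (alpha if alpha is not None else 0)
    return total / (16 if alpha is not None else 12)

for j, a, b in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    print(f"{j}:{a}:{b} -> {bandwidth_factor(j, a, b):.3f}")
# 4:4:4 -> 1.000, 4:2:2 -> 0.667, 4:2:0 -> 0.500
```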

Just like with cameras, we seem to be heading for more pixels and more compression rather than "lossless" accuracy. More pixels means less aliasing, which matters more than color accuracy.

In moving pictures, a screen pixel spends a small percentage of the time showing the intended color and the rest of the time transitioning between colors.

Besides, isn't the HDR feature of TVs just a filter that compares several frames and merges them to create a picture with slightly higher color contrast, as Jeff described on the Bombcast? And let's not forget that all digital TV broadcasts and VOD streams are mercilessly compressed. Even if a TV could display a 4:4:4 signal, where would you get one?


#15  Edited By Zurv

@eurobum said:

@zurv: Seriously, why should color accuracy matter for games of all things? [...]

Sadly, I don't listen to the Bombcast anymore because it is boring. :) But Jeff hadn't even updated his TV for HDR gaming, so he isn't a great source. Also, HDR content from Netflix and Amazon is pretty darn amazing. Personally, I don't care about shit sources like cable or VOD; my TV is mostly for gaming (and a little from my pointless UltraHD Blu-ray player), fed by a powerful PC.

Also, don't confuse color depth and dynamic range. Dynamic range is about shadow detail. A normal TV is "limited," but not the new ones, and it would be crazy for an OLED to gimp detail in dark and bright areas, more so when the source is a PC, game console, Blu-ray player, streaming device, etc. Consoles will output 8-bit full when not showing HDR content. The difference between limited and full is super clear when playing a game or doing desktop stuff.

(Also, the bandwidth issue is about 4K at 60Hz: 10-bit (or higher) plus full 4:4:4 won't fit.)

Colors matter. One doesn't have to be super picky like one would be in print media, but colors should be close to reference; skin shouldn't look red or green, for example. :) The good news is that the ISF presets on the LG OLED are shockingly good. It doesn't have to be "perfect" for gaming either, but the OP was thinking a monitor would be better for this, which in most cases isn't true, and one still needs to calibrate.

HDR isn't something funky (unless someone turns it on for non-HDR content, which is the same as fake 3D for non-3D sources, and silly IMO). There is real metadata sent to the display; it isn't just a filter. (HDR10 uses static metadata; Dolby Vision, which is now supported by NVIDIA GPUs as of the last driver, sends dynamic per-scene metadata. The LG OLEDs support HDR10 and DV, and some other new standards. Netflix has some shows that do DV.)

@giant_gamer: I suggest getting something now. :) Here is my logic for it.

(Well, first: pixel response time and input lag are totally different things. Most "response times" listed on monitors are pixel response times (OLED's pixel response is pretty much 0). What most people mean when they talk about lag is input lag. Some monitors can get down to about 10ms of input lag, but those are normally pretty small screens, all under 27"-28". I think the general rule of thumb is that under 50ms is "ok," but getting under 30ms is better. :) This is where the 2016 units sit; it is around 20ms for the 2017 models.)
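For anyone else mixing the two up, a tiny sketch of how the numbers in this thread line up (Python; the OLED figures are the ones quoted above, while the Philips monitor's input lag is a made-up placeholder, since its listing only gives the 5ms response time):

```python
# Input lag and pixel response time are different, roughly additive delays:
#   input lag     - processing before the display starts drawing the frame
#   response time - time a pixel then takes to finish changing color
displays = {
    "2016 LG OLED (game mode)":   {"input_lag_ms": 33, "response_ms": 0.1},
    "2017 LG OLED (game mode)":   {"input_lag_ms": 21, "response_ms": 0.1},
    "Philips 40in IPS (assumed)": {"input_lag_ms": 45, "response_ms": 5.0},  # lag is a guess
}

for name, d in displays.items():
    total = d["input_lag_ms"] + d["response_ms"]
    print(f"{name}: {d['input_lag_ms']} ms lag + {d['response_ms']} ms response ~= {total} ms")
```

The takeaway: a "5ms" monitor spec tells you almost nothing about input lag, which is the number that actually determines how laggy a game feels.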

1) Were you planning to get the new 2017 models, i.e., the cheapest being $3,000 for the lowest-end 55"? Because IF they release HDMI 2.1 in products next year, that is the pricing you are going to be looking at. Right now is a nice sweet spot for pricing, and the screens are the same between the 2016 and 2017 models. (Yes, some things are better in the 2017 models, but is that worth 2x the price to you? It is for me, but I'm a crazy person. As I've said, I've bought at least one new OLED each year for the last 4 years. But I wouldn't replace a 2016 with a 2017... hrmm, unless that 77" comes under 10 grand... hrmm... 77".)

2) For HDMI 2.1 to work, all your other devices need to be HDMI 2.1 too: video cards (I'm assuming we are talking PC gaming here; for consoles, do you think MS and Sony are releasing new units next year too?) and your receiver. If the pattern follows HDMI 2.0a/b, NVIDIA waited until well after everything else I owned was 2.0 before they offered a video card with it. (Only Pascal-based cards have HDMI 2.0.)

3) The limited/full stuff is real, but not very limiting. The consoles will auto-switch between full for non-HDR and limited for HDR. The extra HDR data does help limit the impact of "limited." Also, even with new TVs you still have to turn on the option for the TV to accept "deep color" (in LG speak; PC color or full RGB on other devices), and the sending devices have to have it turned on. My guess is most people don't even turn it on, so limited vs. full doesn't matter to them. It very much looks like that's what happened at the San Francisco GB office with their new Vizio TV: they did not turn it on and thus had problems with HDR. (Hey Jeff, did you turn on deep color? It is per HDMI input, in the general options menu. :) )

The OLEDs are really amazing now. There will always be something better coming, but HDMI 2.1 is at the earliest a year away (maybe more) and will be costly. Also, you are on a TV, most likely with a controller. Do you really care about going above 60fps? I care on my desktop monitor, but not on my TV. Hell, I turn vsync on on my TV because I don't want to see tearing, and I sure as hell can't react that fast with a controller anyway. :)

(Oh Jeff... ask Dan how to set up your TV. Shockingly, he does know. :) )

avantegardener

@giant_gamer: What are you going to hook up to it, a PC or a console? And are you looking for a desktop solution, or for a new TV, period?


#17  Edited By csl316

Going full OLED seems like a good way to future-proof for a few years.


#18  Edited By Zurv

@giant_gamer: A follow-up on @avantegardener's question: I'd suggest you wouldn't want to use this as a desktop. The TV will auto-dim and there is no way to stop it in the TV settings. For a game or a movie this isn't a problem, because the TV sees content on the screen moving. But just moving a mouse around or reading email most likely isn't enough to tell the TV that something is going on. I had a buddy try this when the 4K OLEDs first came out and he was very sad. :( His fix in the end was to have some moving images always running on part of the screen.

The dimming doesn't kick in right away, so reading a webpage or screwing around on the desktop for 10-15 minutes isn't a big deal, but this would be a deal-breaker for most people as a desktop display.

The auto-dim is there to help prevent image retention (IR). Burn-in isn't an issue (unless you have max brightness and leave something static on the screen for weeks), but IR can happen from time to time; it goes away, and the dimming helps. I came from plasmas, where this was a much bigger issue, but even there it wasn't a big deal.

I leave a static desktop or a paused game up all the time, sometimes for a really long time, and I haven't really had an IR issue. The TV also does pixel shifting (which you can shut off; I haven't noticed it), and after 4 hours of cumulative run time the TV will run a clean-up magic-something :) the next time it is turned off.


#19  Edited By Eurobum

@zurv: To put it in the simplest terms, HDR video has little to do with either colors or their accuracy. What it changes is luminosity/brightness, which changes color in a way. Because it's a kind of post-processing it's also faked (or at least it can be); photo HDR is fakery and is something quite different from the advertised HDR feature.

HDR started with HDR photography; some smartphones have this feature. In an extreme example, you take pictures of a city throughout the day from the same point of view, then an algorithm replaces the parts in the shadow of buildings with their brighter versions, as if the picture were lit by as many suns as pictures were taken. This creates a kind of super-picture which is very well lit, but actually not realistic at all. And since the sun's color spectrum changes during a sunset, you can also get shifting colors.
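For the curious, here is a toy version of that merging idea (Python/NumPy). It assumes the bracketed shots are already aligned and keeps only the brightness-based weighting; real tools (e.g. Mertens exposure fusion) also weight contrast and saturation:

```python
import numpy as np

def merge_exposures(images, sigma=0.2):
    """Blend aligned exposures so each pixel favors the best-exposed shot."""
    stack = np.stack(images)                        # (N, H, W, 3), values in [0, 1]
    # Weight each pixel by its closeness to mid-grey (0.5 = well exposed).
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)   # normalize across the N exposures
    return (weights * stack).sum(axis=0)

# Fake bracketed exposures of a simple gradient "scene":
scene = np.linspace(0.0, 1.0, 256).reshape(1, 256, 1).repeat(3, axis=2)
brackets = [np.clip(scene * gain, 0.0, 1.0) for gain in (0.5, 1.0, 2.0)]
merged = merge_exposures(brackets)  # shadows favor the bright shot, highlights the dark one
```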

[image] An extreme example (Source) of HDR photography, which shows how not to do it and encourages restraint.

Anyway, back to televisions. In TVs, HDR is not about colors (they still can't make more colors than they already do by combining 3 colored sub-pixels); it's about brightness. Televisions have to have eye-searing brightness to work even during daylight and across a living room. OLEDs got both brighter and darker, so they need a bigger scale/range for luminosity; thus was born high dynamic range video.

I guess you could either have an HDR source, or the TV could interpolate finer degrees of brightness by comparing subsequent frames (which isn't advisable for games). This makes some sense for really bright things that would just be max brightness on a normal screen, or for really dark stuff, which is even more important.

Anyway, I'm just trying to make sense of it; with these things it's always hard to tell where the technology ends and the marketing BS begins. Certainly labeling this feature "HDR" is an attempt to invoke something common and recognizable. Gamma depth/range might be a more appropriate name, which would also be familiar to anyone who played games on a Doom engine.

I believe Jeff might have referenced this (fluff) article: http://www.polygon.com/2016/9/9/12863078/ps4-pro-hdr-development

hippie_genocide

4K gaming still seems quite a ways off to me, at least at framerates that I would consider acceptable. On consoles you can forget it; they are at least one whole hardware revision away from being able to play games at 4K resolution at a solid 60fps. On PC, sure, you could get a 4K monitor, but are you prepared to also buy the hardware to push all those pixels? If I were to buy an OLED TV (and I do want one) it wouldn't be for playing games on, since they are kinda bad at that, other than occasionally seeing how a game looks with HDR on or whatever.

Zurv

@eurobum said:

@zurv: To put it in the simplest terms, HDR video has little to do with either colors or their accuracy. What it changes is luminosity/brightness, which changes color in a way. Because it's a kind of post-processing it's also faked (or at least it can be); photo HDR is fakery and is something quite different from the advertised HDR feature.

I didn't say HDR has to do with color accuracy, but a color depth of 10 or 12 bits is required for it. The OP brought up color accuracy as the plus for the IPS. Also, while HDR is cool, it is still super minor at this stage: few console games support it, and like 3-4 PC games do. And don't confuse the HDR photography that has been around for a long time with this new HDR for games and movies. It isn't fudged (well, a lazy content producer could fudge stuff); it is real metadata being sent to the display.

HDR is freakin' amazing in RE7, Zero Dawn, and ME:A, but totally pointless in Hitman and Shadow Warrior 2 (then again, everything about SW2 is pointless :) ).

Side note: it was Dan and Vinny coming over to my apartment and seeing OLED and real 4K PC gaming last summer (sadly no HDR gaming, but I did have those sexy LG demos :) ) that pushed Dan to get the OLED and go SLI.

@hippie_genocide: Lower-res content still looks good on these TVs. But as you said, if one wants to spend the money, the PC power is there, albeit at a cost higher than the TV itself. 1440p content would still look amazing, and some games on a 1080 Ti (or Titan X Pascal) do work at 4K. I agree about the consoles, though (kinda... Zero Dawn, with whatever fake-4K magic they are doing, looks goddamn great). Even the upcoming new Xbox is still really weak compared to PC video cards. (But it will be a really great console for 1080p gaming!) Also, while only 4K TVs have HDR, I don't think you need 4K resolution to do HDR.

Zurv

Newegg has a flash sale this weekend on the 65" 2016 OLED (which is pretty much the same as the 2017 TVs):

https://flash.newegg.com/Product/9SIA9KB5DH9628

That is a lot, but less than the 55" 2017 model.