#1 Posted by defaulttag (890 posts) -

I'm going to buy a new HDTV for my bedroom. Does refresh rate affect the quality of your gaming experience on your HDTV? Just asking for a little input because there are so many options out there. I do know I want 1080p and a screen size over 30 inches, but all the other features confuse me. I have been to CNET and read the articles on what these features do, but they don't seem to cover how they pertain to gaming, only movies. So do these features affect how a game looks and plays? Or is 60Hz good enough?

#2 Posted by Geno (6477 posts) -

Most screens that claim to be more than 60Hz are usually just 60Hz panels with a bit of redundancy added in, rather than actually showing more unique frames per second. 60Hz will be fine for quite a few years.
#3 Posted by Scooper (7881 posts) -

60hz is totally fine.

#4 Posted by Sil3n7 (1198 posts) -

If you want to game in 3D you need at least 120Hz. If you are going to spend a lot of money anyway, you might as well make it as "future proof" as possible.

#5 Posted by Evilsbane (4697 posts) -

60Hz still gets the job done and looks great.
 
120Hz takes it up a notch and makes everything look a little smoother, and you can get 120Hz TVs for pretty much the same price, so this is what I would aim for.
 
240Hz is very new and very expensive, and though it looks super clean, it isn't worth it right now.

#6 Posted by Marz (5667 posts) -

60Hz basically equals 60fps, and most games are tuned for that. Faster television sets only make screen tearing less apparent; you don't really get any more benefit than that. The exception is when your TV has a gimmick like motion smoothing, which sometimes makes movement look unnatural on the higher-Hz sets.

#7 Posted by PeasForFees (2411 posts) -
@defaulttag: To play games you will have to turn the motion processing down to its minimum, because those processing effects create more input lag, which makes games play worse.
#8 Posted by EpicSteve (6495 posts) -
@Evilsbane said:
" 60Hz Still gets the job done and looks great,   120Hz takes it up a notch and just makes it look a little smoother and you can get TVs with 120 for pretty much the same price so this is what I would aim for.   240Hz is very new very expensive and though it looks super clean it isn't worth it right now. "
This.
#9 Posted by Maxszy (2067 posts) -

As others have said, 60Hz is just fine. Also, depending on what screen size you are getting, you should really consider whether to go 1080p like you say, or 720p. If you are only going for 32 inches, I wouldn't really bother with a 1080p screen unless you are sitting fairly close to it. Really take into consideration how far you are going to be from your new HDTV.
 
Here is a website that calculates, based on your distance from the HDTV, at what point the extra detail of 1080p over 720p is or isn't noticeable. Also a graph, as visuals are nice too. Just keep these things in mind so when you actually go to buy your HDTV you can get the best bang for your buck. Unless money isn't really an issue, then just go for the "highest" because it doesn't hurt.

  http://myhometheater.homestead.com/viewingdistancecalculator.html 
 http://hd.engadget.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/  
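If you're curious, the math behind those calculators boils down to the angle a single pixel covers from where you sit. A rough sketch (assuming the usual one-arcminute, 20/20-vision rule of thumb those charts are based on; the sites may use slightly different thresholds):

    import math

    def max_distance_to_resolve(diagonal_in, vertical_pixels, aspect=(16, 9)):
        """Farthest viewing distance (in inches) at which one pixel still spans
        roughly one arcminute -- the usual 20/20-vision assumption. Sit farther
        back than this and the extra resolution stops being visible."""
        aw, ah = aspect
        height_in = diagonal_in * ah / math.hypot(aw, ah)
        pixel_in = height_in / vertical_pixels
        one_arcminute = math.radians(1 / 60)
        return pixel_in / math.tan(one_arcminute)

    for size in (32, 42, 50):
        d1080 = max_distance_to_resolve(size, 1080) / 12  # feet
        d720 = max_distance_to_resolve(size, 720) / 12
        print(f'{size}": 1080p visible inside ~{d1080:.1f} ft, 720p inside ~{d720:.1f} ft')

For a 32" set that works out to roughly 4 feet for 1080p, which is why 720p is usually fine at that size unless you sit close.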

#10 Posted by defaulttag (890 posts) -
@PeasForFees: On what refresh rate does this problem occur?
#11 Posted by Diamond (8634 posts) -

Unless you have a high-end PC for gaming, there's no way you'll even be feeding it more than 60Hz anyway (well, 120Hz when the PS3 3D update comes out, along with games that use it). The 120Hz or 240Hz image processing that TVs can have isn't really made for games, and generally doesn't really increase quality. And this isn't like CRT monitors, where the screen flickered and going above 60Hz helped with eye strain.
 
Can single-link DVI or HDMI 1.4 even update 1080p at 240Hz? I don't know.
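Rough back-of-the-envelope numbers, if anyone's curious (ballpark video payload rates for each link, ignoring blanking overhead, so treat it as a sketch rather than spec-exact):

    # Can common links carry 1080p at a given refresh rate? Ballpark check:
    # the payload rates below are the commonly quoted figures, and the pixel
    # math ignores blanking intervals, so this is only a rough feasibility test.
    def video_gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    links_gbps = {
        "single-link DVI": 3.96,
        "dual-link DVI": 7.92,
        "HDMI 1.3/1.4": 8.16,
    }

    for hz in (60, 120, 240):
        need = video_gbps(1920, 1080, hz)
        fits = [name for name, cap in links_gbps.items() if cap >= need]
        print(f"1080p@{hz}Hz needs ~{need:.1f} Gbps -> {fits or 'none of these'}")

By that math, 1080p at 240Hz needs around 12 Gbps, which none of those links carry, so any 240Hz is generated inside the set rather than sent over the cable.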

#13 Posted by PeasForFees (2411 posts) -
@defaulttag said:
" @PeasForFees: On what refresh rate does this problem occur? "
Whatever TV you buy, turn its settings to Game Mode or PC Mode; this cuts down or switches off the motion processing and upscaling on the TV, which reduces input lag. Leave it on and you can press a button on a mouse/keyboard/controller and see a noticeable delay on the screen. Some input lag happens on most TVs, and it is made worse by processing effects. If you're still confused I can explain more. Also, what's your budget, and what country are you in?
#14 Posted by Symphony (1912 posts) -

I just received a 60Hz 42-inch LCD TV for Xmas and the motion blur issue is noticeable but not game-breaking. Even in the Xbox dashboard menus you can see the boxes playing "catch up" as you scroll back and forth between them. It's odd at first, but you get used to it. If you're going for crystal-clear quality and don't mind spending a bit more money, I'd say go with 120Hz (or maybe even a plasma; I'm not sure they suffer from the same issue).

#15 Posted by Seedofpower (3947 posts) -

I got 60hz and everything looks great!

#16 Posted by WilliamRLBaker (4779 posts) -

60Hz.
120Hz is before its time; I've yet to see a movie not look like crap at 120Hz, everything looks exaggerated.
60Hz is your best bet.

#17 Edited by Branthog (5595 posts) -

 *** NO TO GREATER THAN 60HZ ***
Depending on personal preference and taste, television and movie content may be fine at 120Hz and 240Hz. Some people prefer it while others of us abhor it. It destroys that "film" feel and makes movies look like they were shot on a camcorder. In and of itself, it is not "better". In fact, most people I know do what I do -- we turn off the 120Hz "TruMotion" or whatever each brand calls its version, because it is distracting.
 
However, when it comes to gaming -- DO NOT USE IT. You want to be using 60Hz for gaming; it'll be *optimal* at 60Hz. Console games don't run faster than 60fps anyway. The 120Hz and 240Hz stuff adds nothing to the gaming experience and actually detracts from it -- everything from noticeable motion blur that can make some games entirely unplayable to plain added lag. If you're using an LCD television, you already have input lag. The last thing you want is *more* input lag. Using the 120Hz and 240Hz crap introduces more processing, which takes more time, which results in more input lag. It'll make playing rhythm games a bitch and become quite frustrating in multiplayer games like Modern Warfare and so on.
 
You need to sit down and watch some content (movie and television, at least) on a screen using "Cinema" mode or other "120Hz+" modes. Decide if you like it. Chances are you won't. If you don't like it -- you can buy a television that is just 60Hz and you'll save a LOT of money over the 120Hz version of the same television. If you like it enough to spend the extra money, go ahead and get it. Just make sure that you can turn it off when gaming. Setting a television to "gaming mode" will usually set the screen to 60Hz mode and also disable all the additional processing that can bring additional lag (it won't eliminate lag entirely, as a couple of frames -- which equates to at least 34ms of lag on the best LCDs -- is always there for the base processing).
 
While you're at it, you should also check what the "input lag" of your LCD screen is, if you buy one. Some are better than others. Samsungs can have some horrible lag in the range of six frames (a frame is about 16 or 17 ms of lag). That means that in the best of circumstances, such a television would have almost 100ms of lag (again, more noticeable for some people than others, but rhythm games will demonstrate it the best). On the other hand, some screens, like (believe it or not) the LGs, have only three frames of lag. And if you get the LG **LH30 (not the LH40s or 50s), I believe they only have two frames -- about the lowest to be found on consumer screens these days. You don't need to have the lowest-lagging screen, but you obviously want to avoid the high-lag screens if you can.
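The frames-to-milliseconds math there is just this (quick sketch; a "frame" here meaning one refresh interval at 60Hz):

    # Convert display lag quoted in "frames" into milliseconds, where one
    # frame is one refresh interval at 60Hz (~16.7 ms).
    FRAME_MS = 1000 / 60

    for frames in (2, 3, 6):
        print(f"{frames} frames of lag ~= {frames * FRAME_MS:.0f} ms")

So 2 frames is about 33ms, 3 frames about 50ms, and a bad 6-frame set is around 100ms.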
 
For the most part, the "120hz" and "240hz" stuff is a gimmick. In certain conditions, it can make certain content look a bit better (because of how that content is formatted for cinema and then home display). I kind of liked watching MOON at 120hz. It can make space scenes in BSG and other things look kind of awesome (because it smooths it out rather than looking fast and jaggy). Those few cases just aren't really worth it, though, in my opinion.
 
If you want to find out more information about input lag and "120hz" and gaming, I'd advise going to avsforum.com and doing a bit of reading. Ask questions if you're unclear on anything.
 
The main thing I'd hope you take away from my (long) post is that 60hz is *ideal* for gaming and the 120hz/240hz stuff is mostly a gimmick and only worth it on non-game content -- and even then only if you personally prefer it. You may very well find that you detest it and can't stand watching stuff with it turned on.
 
By the way, if you're looking for a reasonably priced LCD that would be pretty decent for gaming (though it isn't the absolute perfect picture-capable set if you're an intense videophile... which I am thankfully not), you might want to check out the LGs. I just bought one for my second entertainment center to complement my main entertainment center. I focused on a gaming-centric setup and went with a low-input-lag LG 42LH30 (the 42 is the size in inches and the LH30 is the version -- the 30 is the one to get for gaming, because while the 40 has 120Hz capability, it also has more input lag than the earlier 30 model, which is the version a lot of gamers looking for deals have been going for).
 
Hope that helps, somewhat. Finding a television -- much less one ideal for gaming -- is becoming ridiculous. There are so many things you can get hung up on and so many pitfalls and it's easy to get something you'll regret or even just spend months not buying anything, because everything you look at is nitpicked apart by "videophiles" who know way more than you or I (and as a result, have a complaint about pretty much every system!).  

Good luck.

#18 Posted by Qorious (847 posts) -

60Hz is fine for most if not all games today. You will only see a slight decrease of lag if you bump it up to 120Hz or 240Hz. And it also depends on how fast your eyes can perceive lag as well. 

#19 Posted by Branthog (5595 posts) -
@WilliamRLBaker said:
" 60hz 120hz is before its time, i've yet to see a movie not look like crap on 120hz every thing looks exaggerated. 60 hz is your best bet. "
The only thing that looks interesting is the occasional computer generated space battle scene. Even then, it looks kind of awkward. All this 120hz and 240hz crap is doing is stuffing lots of frames that don't exist in-between those that do to "smooth" things out, which just ends up making them look unnatural -- because that's not how they're filmed by the content's creator. If movies were filmed at 60fps or 120fps, then that'd be different. I don't see that happening any time soon.
 
Videotape is recorded (I believe) at a faster speed than, say, film. You know how stuff shot on videotape has a sort of "weird" look and feel to it that feels cheap? You know, those few "video" episodes of Twilight Zone, the health and safety video at wherever you work, and Dark Shadows? That's basically what 120hz and 240hz gives you. It takes that classic "film" look that you get when you go to a theater and watch a great action flick or noir film -- and turns that $200,000,000 movie into something that looks like it was shot on a home camcorder...
#20 Posted by Th3dz (332 posts) -

It is always best to have the same setting on the TV as the games you're playing... 60Hz with a 60fps game will look best because that's the way it's meant to be played. It's pretty logical when you think about it! If you convert an MP3 file to an uncompressed WAV it won't sound any better, will it? Same deal here.

#21 Posted by Raven_Sword (3447 posts) -

 

I'm aiming for 120Hz for my next TV. All I know is I need a new one, because I have pretty bad vertical banding on mine right now.

#22 Posted by defaulttag (890 posts) -
@woltkezero: I will be playing PC and console games on it. So does 120Hz = better framerate?
#23 Edited by Diamond (8634 posts) -
@woltkezero: @defaulttag: 
You guys would have to look up whether the LCD in question can actually update that quickly. I know at least some (most?) LCDs will only update pixels at 60Hz max based on the source (ignoring the interpolation from image processing).
 
Either way, there's a lot of question whether you could actually perceive it anyway; personally I can't perceive FPS increases at all above about 80fps even in best-case scenarios.
#24 Posted by ThePhantomnaut (6201 posts) -

I was watching Up with the high-refresh-rate smoothing on and it looked so horrible.

#25 Posted by monkeyroach (175 posts) -

A 120Hz TV can't really do 120fps; it's just showing 60fps twice. 120Hz TVs are really only for watching movies (24fps), because it removes the judder in panning and moving scenes. A 120Hz set also helps with music games because it cuts down on ghosting.
 
I say go for it if you watch a lot of movies and play a lot of music games.
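The judder point is really just about how evenly 24 divides into the panel rate. A quick sketch (assuming plain frame repetition, no motion interpolation):

    import math

    # Why 24fps film judders at 60Hz but not at 120Hz or 240Hz: 60 isn't a
    # multiple of 24, so frames get repeated unevenly (the 3:2 pulldown
    # cadence), while 120 and 240 divide evenly by 24.
    def cadence(panel_hz, source_fps=24):
        if panel_hz % source_fps == 0:
            return f"each frame repeated {panel_hz // source_fps}x -- even, no judder"
        lo, hi = math.floor(panel_hz / source_fps), math.ceil(panel_hz / source_fps)
        return f"frames alternate {hi}x and {lo}x repeats ({hi}:{lo} pulldown) -- uneven, judders"

    for hz in (60, 120, 240):
        print(f"24fps film on a {hz}Hz panel: {cadence(hz)}")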

#26 Posted by defaulttag (890 posts) -
@ThePhantomnaut: I have seen Pixar Blu-rays running at 1080p/24Hz, which is only compatible with 120Hz, 240Hz or higher. IT LOOKS WEIRD!!! It looks like it is slightly being fast-forwarded. Does this affect all movies running at those settings? Or is it because of the way Pixar movies are made?
#27 Posted by TheHBK (5554 posts) -

For games it doesn't matter; you will play with Auto Motion Plus or whatever they call it turned off. It affects games too much, because they move fast and have HUDs, so those blur a little due to the processing the TV does. If you can get a TV with that feature, cool, because I like using it for movies -- it makes them look smooth. But for games, turn it off, because it's not good and can slow down response time.

#28 Posted by fallen_elite (380 posts) -

For only playing games, a 60Hz TV is better, because 120Hz TVs introduce higher input lag due to the extra processing required in them. If you are sensitive to input lag and you don't watch movies, then you are probably better off buying a 60Hz TV. If you are not sensitive to input lag (or don't know what it means) and/or you watch movies, then you should probably purchase a 120Hz TV over a 60Hz one.

#29 Posted by InfiniteStateMachine (215 posts) -

Oh jebus, they have 240 now?
 
Interesting notes on the 120Hz stuff for video games. I was wondering how a TV is supposed to magically create more frames. Is it just linearly interpolating the pixel color between two frames and inserting another one?

#30 Edited by SeriouslyNow (8534 posts) -
@Evilsbane said:

" 240Hz is very new very expensive and though it looks super clean it isn't worth it right now. "

High hertz rating TVs have three separate definitions :-
 
In plasma 600Hz TVs it's essentially a lie based on motion compensation technology: they come up with the figure by breaking the screen calculations into ten separate blocks and saying that each block can refresh at 60Hz. They go on to say that this means motion will be smoother in fast-motion scenes, because it's rare that the whole screen needs to be updated at the same time. This is obviously bullshit. The screen is just a normal 60Hz plasma panel coupled with a slightly clever motion engine which can dynamically update or ignore information that doesn't move.
 
In some 240hz LCD TVs it's essentially just a 120hz panel with an LED back lighting system that flickers on and off at double the rate of the pixel refresh, which gives a 240hz "effect".  LG and Toshiba are two manufacturers who are selling 240hz effect TVs.  Avoid these unless they are priced close to other 120hz sets.
 
Proper 240hz LCD TVs actually have a 240hz panel and the proper motion compensation processor to match.  Samsung and Sony are releasing proper 240hz sets now and throughout the year.
 
The panel's refresh rate is directly related to your FPS if you game on a PC, but it will probably max out at 120fps (120Hz), because no sets actually take input higher than 120Hz due to limitations of the HDMI and dual-link DVI standards. Supposedly future DisplayPort revisions (the most modern standard, which can transfer video, sound AND power over one cable) will not be limited to 60Hz as it is right now.
#31 Edited by Three0neFive (2300 posts) -
@Branthog said:

" @WilliamRLBaker said:

" 60hz 120hz is before its time, i've yet to see a movie not look like crap on 120hz every thing looks exaggerated. 60 hz is your best bet. "

The only thing that looks interesting is the occasional computer generated space battle scene. Even then, it looks kind of awkward. All this 120hz and 240hz crap is doing is stuffing lots of frames that don't exist in-between those that do to "smooth" things out, which just ends up making them look unnatural -- because that's not how they're filmed by the content's creator. If movies were filmed at 60fps or 120fps, then that'd be different. I don't see that happening any time soon.  Videotape is recorded (I believe) at a faster speed than, say, film. You know how stuff shot on videotape has a sort of "weird" look and feel to it that feels cheap? You know, those few "video" episodes of Twilight Zone, the health and safety video at wherever you work, and Dark Shadows? That's basically what 120hz and 240hz gives you. It takes that classic "film" look that you get when you go to a theater and watch a great action flick or noir film -- and turns that $200,000,000 movie into something that looks like it was shot on a home camcorder... "
What this guy said. 120Hz doesn't mean shit if the content is only 60Hz (which it still is -- we aren't going to see true 120Hz on consoles until next gen at the very earliest; all signs currently point to more post-processing and interpolation rather than an upgrade in content standards), and in fact the extra frames are a massive detriment in any game that isn't running at a constant 70+ frames or that requires split-second timing, like CS or any other FPS.
 
If you're talking PC gaming, though, nVidia has a 3D package that comes with shutter glasses, a 120Hz monitor and some other things, if you're interested.
#32 Edited by SeriouslyNow (8534 posts) -
@InfiniteStateMachine said:

" oh jebus they have 240 now?   interesting notes on the 120 stuff for video games, I was wondering how a TV is supposed to magically create more frames. I s it just linearly interpolating the pixel color between 2 frames and inserting another one? "

Yes and no. There are multiple techniques involved. The most common of these are based around IVTC, or 3:2 pulldown. Essentially, motion compensation engines take a given frame count and convert it to the panel's Hz rate using linear, logarithmic and other methods of interpolating the frames that get added in between key frames from the original source. The reason modern motion compensation engines don't rely solely on linear interpolation is that more often than not it creates an artificial smoothness which is abhorrent to the eye (distracting/uncanny-valley effect), and it can create motion artifacts when sources use mixed-framerate material -- some David Lynch movies and TV shows which were shot on film and then converted to digital exhibit these problems.
 
An even simpler example is TV news, where the talking-head presenter requires a different type of motion compensation than the news ticker that may be running at the bottom of the screen: the presenter needs more dynamic motion compensation, whereas the ticker runs at a fixed rate across the screen, so linear interpolation is enough for it. To that end, motion compensation techniques have become more dynamic in their approach, with engines now capable of independently applying different techniques to screen "elements" before the entire scene is recombined into what displays on the panel. This of course can and will add milliseconds of lag if it's not disabled when using an external input source such as a PC or games console, and so many modern HDTVs have some "Game", "PC" or "1:1 Pixel Mapping" setting (yes, the latter is used to make sure that no overscan is in place, but it usually also disables the motion compensator) to bypass the motion compensation.
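To make the naive version of that concrete: the crudest possible tween is just a straight blend of two neighbouring frames, which is exactly the kind of plain linear interpolation described above as looking smeary on its own. A minimal sketch (random arrays stand in for decoded frames, so it's illustrative only):

    import numpy as np

    # Crudest possible "motion compensation": a straight linear blend of two
    # neighbouring frames. Real engines layer motion-vector search and
    # per-region handling on top, precisely because plain blending ghosts.
    def linear_tween(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
        """Return the in-between frame at fraction t (0..1) between a and b."""
        blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
        return blended.round().astype(np.uint8)

    # Two stand-in 1080p RGB frames (noise, just so the code runs).
    a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
    b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

    # Turning a 60fps source into 120Hz this way means inserting one tween
    # at t=0.5 between every pair of source frames.
    midpoint = linear_tween(a, b, 0.5)
    print(midpoint.shape, midpoint.dtype)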
#33 Posted by InfiniteStateMachine (215 posts) -
@SeriouslyNow: So essentially it comes down to a set of fine tuning between interpolation techniques? Each company has a different method but at the end it comes down to deriving in-between frames from existing frame information?
 
What would happen in frame-loss situations, where the renderer can't keep outputting at 60fps for various reasons?
#34 Edited by SeriouslyNow (8534 posts) -
@InfiniteStateMachine: 
 
Well, they pretty much all derive their techniques from the single most commonly licensed API, that of the 3:2 pulldown de-interlacing Faroudja DCDI. Faroudja were the first company to sell HD 1080i rear-projection sets in the US, in 1992, at the alarming price of $35,000 USD. Since then their technology, in chip and API form, has been licensed throughout the industry, with each manufacturer adding their own techniques and upgrades as they go. Loewe, for example, were the first company to offer dynamic frame interpolation for different scene elements with their Xelos CRT models, which had the DMI (Dynamic Motion Interactive) feature on top of their original DMM (Dynamic Motion Movement) motion compensator, which essentially turned any interlaced internal or external source at any framerate into 50Hz progressive. DMM often made text tickers skip or made people look like they were slightly sped up because the tweening wasn't dynamic enough -- an apparent problem with the basic Faroudja DCDI on which it was based. DMI is a better algorithm because, as described above, it deals out different interpolation types to different scene elements before delivery to the screen, so people move at the right speed and ticker text doesn't skip frames.
 
Skipping or frozen frames are also a noticeable aspect of frame loss. No motion compensator can really fix that. The motion compensators which can't keep up with the display device's output generally don't make it to market in decent brands, or are fixed in firmware patches after the fact. Samsung LCD TVs, for example, use embedded Linux to manage the controls for the motion compensation engines they have. I've noticed some difference between versions, and one was specifically made to fix compatibility issues with Australian Channel 9 HD, who changed their bframe (key frame with inverted palette) setting, which would actually 'hang' some Samsung sets entirely.
#35 Posted by InfiniteStateMachine (215 posts) -
@SeriouslyNow:  Very interesting. I was thinking maybe you could check for frame skips but the more I thought about it the less practical it seemed (larger frame buffer needed, would have to process further ahead when it's already stressed etc)
 
thanks for the info :)
#36 Posted by SeriouslyNow (8534 posts) -
@InfiniteStateMachine:  

No problemo. Yeah, as you said, a larger frame buffer and having to read further ahead would have to apply, and this approach doesn't work for digital television as effectively as it could for local HD sources such as Blu-ray or networked media players. You'll notice that many HDTVs have a sort of "one remote controls all media devices of the same generation from the same manufacturer" approach; in those cases you can get a smoother experience, because the motion compensators are tuned to deal with frame drops, network lag and so on from devices of the same manufacturer. Samsung refers to this pooling of tuned devices and TV-hosted codecs as WISELINK, Anynet+ and DLNA, which lets the TV play media and control devices over USB, RJ45 and HDMI 1.3 cables.
#37 Posted by defaulttag (890 posts) -
@InfiniteStateMachine: @SeriouslyNow: NO FUCKING CLUE WHAT YOU GUYS ARE TALKING ABOUT!!! Anyway, what type of TV is best suited for a gaming PC and a PS3: one running at the standard 60Hz, or 120Hz? Or perhaps a 120Hz set running in 60Hz mode? I need some info on my earlier question as well:
 
"I have seen pixar Blu-rays running at 1080p 24Hz which is only compatible with 120Hz, 240Hz or higher. IT LOOKS WEIRD!!! It looks like it is slightly being fast forwarded. Does this affect all movies running at those settings? Or is it just more prominent in 3D animated movies?"
#38 Posted by SeriouslyNow (8534 posts) -
@defaulttag: 
All 24p content is best enjoyed in a proper 24Hz mode. All other modes, including 240Hz, which is completely divisible by 24Hz, will produce some motion interpolation artifacts where, as you described, scenes look like they are being ever so slightly fast-forwarded. The reason it's more prominent in animated movies is that they are rendered or cel-painted to exact frame rates for film projection, which is always 24 frames per second, or 24Hz.
#39 Posted by Branthog (5595 posts) -

I am astounded by the level of ignorance in here, proclaiming "yeah dudez!!! 120hz roxorz teh games!". It doesn't. In some cases it may give you the impression of a better visual experience, but at the cost of additional input lag. This can make for some miserable experiences (such as in rhythm/music games, where the lag screws with the timing). There is no benefit for gaming, and more often than not it is a detriment.
 
But don't take my word for it. Go dig up some discussions in well-known A/V forums such as avsforum. LCD televisions already have lag that CRT and Plasma televisions don't. The last thing you want to do is introduce even more by throwing on bullshit gimmick technology like "120hz" crap and increase the lag due to the processing.
 
Why in the hell do you think that the most primary function of any television's "gaming" mode is to TURN OFF 120hz?! 
 
But hey, whatever floats your boat. People convince themselves that because they spent a little extra money on a gimmick, it's the best thing in the world. I suspect these are also the people who run their televisions in "VIVID" mode (a huge no-no if you care about properly viewing the content on your screen and preserving the life of your television).

#40 Edited by CitizenKane (10508 posts) -
@Branthog: What about regular TV and movies? Is it pretty much the same situation?
#41 Posted by defaulttag (890 posts) -
@Branthog: Is this a good choice?
#42 Posted by Branthog (5595 posts) -

 
@defaulttag said:

" @Branthog:
Looking at this TV:

     Thoughts?? "
Before I get on with my really long (sorry) reply, let me copy the links up to here for you so you don't have to search through everything for them:
 
 
  • HDTVs and Videogame Lag:  http://www.avsforum.com/avs-vb/showthread.php?t=558125 
  •  LCD Input Lag Benchmark Thread:  http://www.avsforum.com/avs-vb/showthread.php?p=16092353      
  •  Amazon has the LG 42LH30 for about $550, right now  http://www.amazon.com/gp/product/B001V5J7NY 
  •  http://en.wikipedia.org/wiki/Input_lag   

Samsungs have a reputation for having high input lag compared to other LCDs, but the benchmarks done in the thread (linked below) on AVSForum include testing done on the A550, A650, and A750. The conclusion seems to be that the 60Hz A550 is pretty good (I think it came out to 2 frames, which is about 35ms of input lag -- pretty good for an LCD!).
  
LCD Input Lag Benchmark Thread:  http://www.avsforum.com/avs-vb/showthread.php?p=16092353   

The difference between the A550 and the later A650 and A750 models seems to mostly be that the later ones have 120Hz. As I mentioned, you do not want 120Hz for gaming. Whether you want it for movies is up to your own preference -- but I'd say even if you like it, it's not worth the extra couple hundred bucks it'll usually cost. One other thing to point out if you find yourself drawn to the later models of that Samsung line -- I've read that they may not allow you to completely disable 120Hz... which means it'd kind of suck for gaming. (Well, you'd introduce additional input lag, at least.) The A550 is a 60Hz television, so there's no worry there.
 
So, based on the specifications of that television and some of the discussion about it on avsforum, I'd say that it should be pretty good for gaming. It seems that 2 frames of lag is really good for LCDs, 3 frames of lag is good, and anything over that is pretty bad. You might not notice it in every game, but invite some frame-counting Street-Fighter IV competitive-level friend over to play on your screen and they'll probably get frustrated if you have a high input lag rate (since obviously it throws your timing off). Same for rhythm games.  
 
I have a high-end home theater that I do a lot of gaming on, but I wanted to do a lot of gaming in my home office, too. So I bought a second television. I obviously wasn't going to spend thousands of dollars on a secondary television, but I still wanted something I'd enjoy. For the price, I eventually settled on the LG 42LH40. Believe it or not, LG makes some decent LCDs at a very reasonable price range. Especially if you're not a videophile constantly looking for that "perfect experience". You know, the guys who paint their entire room black, spend weeks blocking out every millimeter of sun and light in the room, and pay hundreds of dollars to have an ISF technician come out and calibrate their display (or worse yet, own thousands of dollars of equipment so they can do it themselves!).
 
I gather you're like the rest of us -- you just want an enjoyable experience and aren't having panic attacks over the black levels in your television set. In that case, I'd definitely advise the LG set for an affordable gaming rig that you can enjoy movies and stuff on. Only, I would advise getting the model just previous to the one I bought. The LG 42LH40 is great, but it's actually a little slower than the 42LH30 model. The primary difference between the two is that the 30 is 60Hz and the 40 is 120Hz. Since you can disable 120Hz, that shouldn't be the cause of the additional lag, but something in the set does increase the lag over the older model. Testing shows that it has about 3 frames (or around 51ms) while the LH30 has 2 frames (around 34ms -- the same as the Samsung you're looking at has, I think).
 
A set with 3 frames of lag is pretty good. One with 2 frames lag is great. Some are awful - as high as 5 or 6. Even that might not seem like much, but think about it -- you might scoff at any network latency over 100ms when you're gaming. And then you're adding as much as another 100ms just from the input lag of the monitor. I haven't been bit first-hand, but I would imagine that puts you at a distinct disadvantage in games like, say, MW2. Where a fraction of a second can be the difference between you killing them or them killing you. 
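To put rough numbers on how that stacks with your connection (purely illustrative figures, not measurements of any particular set or game):

    # Rough feel for how display lag stacks on top of network latency in an
    # online shooter. Numbers are illustrative only.
    FRAME_MS = 1000 / 60

    def total_delay(ping_ms, display_frames):
        return ping_ms + display_frames * FRAME_MS

    for ping in (30, 100):
        for frames in (2, 6):
            print(f"ping {ping}ms + {frames}-frame display lag "
                  f"~= {total_delay(ping, frames):.0f} ms total")

A low-lag set on a decent connection keeps you around 60-70ms total, while a laggy set on a 100ms connection is pushing 200ms before you even see the result of your input.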
 
So, based on the specs and the comments I've seen from owners/testers of the Samsung A550, I would say it sounds like a good set that you wouldn't regret buying for gaming. I can only advise it on those points, since I've never actually owned that model. Based on actual research and experience, I can personally recommend the LG *LH30 line, though. I don't have any reason to sway you toward one or away from another, so I hope it doesn't come across as if I do. I just want to make it clear where I'm making comments based on experience and where I'm commenting just based on data.
 
The one thing I would balk at where that Samsung is concerned is the price. Ouch. They want $750?! Amazon has the LG 42LH30 for about $550 right now. That's a bigger set for a couple hundred bucks less. And yes, I've bought big-screen televisions from Amazon before, so don't be afraid to do that if you discover your best price is on there. Even after paying shipping, it can come out ahead of other places.
 
Amazon has the LG 42LH30 for about $550, right now ( http://www.amazon.com/gp/product/B001V5J7NY ).
 
More than anything, I'd advise checking your future televisions out in person. While that won't tell you what kind of performance you'll get with videogames, it'll give you an idea of the picture. Settle on a few models that you like and then go into your local home theater store and take a look at them up close. Just remember that it won't be *exactly* the same as how it'll look when you get it home. The sets on display in a show room are set to a type of factory preset "vivid" mode specifically designed to make each set look awesome in a show room environment (they do this by doing such set-damaging things as cranking the colors and contrasts and brightness to ridiculous and set-wearing levels).
 
Also, keep in mind that you shouldn't be afraid to negotiate. I bought my 60" Sony SXRD while visiting my family in Portland about three years ago at Home Video Library. The set was going for about $4,000 and I found that I could get some good deals online. I went in, talked to the sales guy, and told him "I'll buy this right now if you can match the deal I found online, for $3,000." Walked out with it for the price I wanted. I'm not sure what your luck at a Best Buy would be with that, but... it's worth a shot if that's where the set you really want is.
 
And if you really have a hard time deciding on something, I'd suggest going to avsforum and asking for help or advice or thoughts on specific sets. That's where I went when I first started out about five years ago and they've been a great help in keeping me informed and avoiding some pitfalls.
 
If you really get your heart set on a system, don't let the difference between 2 or 3 frames of lag stop you. It's just an important thing to keep in mind when you are looking for a set. If you can get one with less lag, why not, right? And while 120Hz is a nifty little trick to have on a television, it still largely remains a gimmick. The content you'll play on your screen will be no more than 60fps at best for games and 24 or 30 fps for television and movies. Getting this 120Hz and 240Hz stuff will smooth out the display of the movie and television content, but that isn't always a good thing. It often looks very unnatural and unnerving (all it's really doing is doubling -- or more -- the number of frames that are already there to fill the supposed gaps). Some day this might be a meaningful thing to have; I remain to be convinced. It actually took me several days of watching movies at 120Hz before I could even tolerate it... and even then, it just made everything look like a cheap home movie shot on a camcorder. If other people feel differently, that's cool. I mean, your eyes like what they like. Just don't be fooled into thinking it has any positive impact on your video games.
 
 Hope you find something you like. I think both that Samsung and the LG should be pretty good. Don't hesitate to poke your head into the forums on that site, though. They've been really patient and helpful with me over the years. :)
#43 Posted by Branthog (5595 posts) -

" @defaulttag: Frame rate is determined by the Video Card, all the 120mhz do is smoth out the video by reducing BLUR....  Good for video games and TV... Some people dont like it on OLD movies cause they are 24 fps and not ....Just google it ...  But this is about Video Games, SO HELL YA its better.   LOL  "
Your video card can do 1000fps, but it's not going to matter to your television, which will be bound by its refresh rate. 
 
Also, it's not "old" movies that are 24fps. It's movies, period. Movie and television content is done at 24fps and 30fps. Movies have been shot at 24fps (that's why movies look like "movies") as the accepted standard for the last 90 years, as have 99% (if not 100%) of the movies sitting on your shelf right now.
 
The reason people don't like 120hz and the reason it isn't necessarily "HELL YA IT'S BETTER" is that part of what gives movies that distinctly cinematic feel to them is that they're shot at 24fps. When you speed that up or smooth it out (essentially giving it the feel of content shot at a higher frame rate), you end up with a result that looks very much like it was shot on video; not film. You know when something has been shot on video and we've come to associate that "video" look with "cheap" and "low budget". You know, films made by high school students, work safety videos at the office, etc.
 
I like the idea of what it's trying to do, but I really feel it's pointless until this sort of thing is being done not by some transformation on your television set, but by the producers of the content in the first place. I don't know if the things Cameron and others are pushing right now in cinema will have any effect on that or not, but I think it will be a very long time before people are willing to give up that "cinema" feel, either as directors or as consumers/viewers.
#44 Posted by RetroIce4 (4392 posts) -

120hz but I think most are at 60hz

#45 Posted by ThePhantomnaut (6201 posts) -
@defaulttag said:
" @ThePhantomnaut: I have seen pixar Blu-rays running at 1080p 24Hz which is only compatible with 120Hz, 240Hz or higher. IT LOOKS WEIRD!!! It looks like it is slightly being fast forwarded. Does this affect all movies running at those settings? Or is it because the way pixar movies are made? "
Well, if it's a film running at a native 24fps, it's likely to happen with anything else too; animated films in particular, for me. Although high refresh rates are cool and all, the smoothing can be a really distracting artifact that lowers the overall experience. Heck, if Avatar had that issue, and possibly in 3D on top of it, it just wouldn't work.