Need some monitor advice: 1440p @60hz or 1080p @144hz?

Posted by Roxasthirteen (55 posts) 3 months, 2 days ago

Poll: Need some monitor advice: 1440p @60hz or 1080p @144hz? (59 votes)

1440p @60hz (Asus PB278Q) 49%
1080p @144hz (Asus VG248QE) 41%
Something else 10%

After building a PC (4690K and 290X Tri-X), I need to upgrade from the 1366x768 LG TV I'm using for now. I want to be able to use my 360 (and a next-gen console, if I get one) on the same screen, so I need at least one HDMI port.

This is why I am not going with the overclockable 1440p Korean monitors like the QNIX or Catleap, because all of them only have a single DVI-D port. I have considered the ROG Swift, the 144Hz 1440p monitor that will be released in a few months, but I don't have that much to spend, plus I would have to switch to an Nvidia SLI setup to fully take advantage of G-Sync.

I do like my FPSs, but I always play on wireless and the campus Wi-Fi isn't always the best, so I will probably mostly be playing stuff like Tomb Raider, the three Witcher games, Skyrim, etc.

So I guess my question is this: as someone who has played a lot of CoD on 360 at 60fps, which setup do you think would make the biggest difference for me?

#1 Edited by rm082e (205 posts) -

If you're not looking to become a high level FPS player, then 144Hz is just a visual effect. The added sensory input data isn't going to make any game better (that isn't twitch based) since you don't need pinpoint accuracy to play against the AI.

I have a pair of Dell 27" 2560x1440 screens at work and a BenQ 27" 1920x1080 at home. The only real difference between them is that when I am using desktop applications and sitting very close, I can see just a hint of the grid on the 27" 1080 monitor at home. In game though, what matters is the resolution the game is running at, not the monitor. I downsample games on my GTX 770 from several different resolutions and I never notice the grid when playing.
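For what it's worth, the "grid" difference comes down to pixel density, which you can sanity-check yourself. A quick back-of-the-envelope sketch for the two 27" panels being compared (the `ppi` helper is just for illustration, not from any of the posts):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel with the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The two 27" panels discussed above
print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI
print(round(ppi(1920, 1080, 27), 1))  # ~81.6 PPI
```

Roughly a 33% density difference, which is why the 1080 panel's pixel grid becomes visible up close while the 1440 one doesn't.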

If you are already sold on ASUS, they have several 27" 1920 x 1080 monitors for around $200. And if you stick with 1080, you will be able to use more of the horsepower in that 290X for pushing effects while maintaining a solid frame rate. If you have super sharp vision though, you may want to go ahead and spring for the 2560 x 1440, but I would definitely go with that over a high refresh rate monitor.

Also, just throwing this out there: If you have money to spend, you might want to look at an Ultra Widescreen 21:9 monitor. A lot of players who have tried Ultra Widescreen are saying it is the next step for gaming, rather than 4K. It's new though, so it's still relatively expensive.

Edit: After reading the replies to this post below, I see I misspoke. I intended to say: "The added sensory input data isn't going to make you better at any game (that isn't twitch based) since you don't need pinpoint accuracy to play against the AI." I didn't proofread my post closely enough. I can see how the original statement above is off base.

#2 Edited by MB (12923 posts) -

@rm082e said:

If you're not looking to become a high level FPS player, then 144Hz is just a visual effect. The added sensory input data isn't going to make any game better (that isn't twitch based) since you don't need pinpoint accuracy to play against the AI.

As someone who actually owns a 1080p/144hz monitor, I can absolutely tell the difference and it's much more than just a "visual effect". Higher FPS makes gameplay more responsive, reduces input lag, and provides a smoother experience. I can absolutely tell the difference between 120 and 60 fps, not so much between 120 and 144. I use my 1440p monitor for anything other than gaming.

To me there is much more benefit in gaming to go from 60fps to 120fps as compared to the benefit of going from 1080p to 1440p and staying at 60hz. Of course it's all personal preference, but having played games for the last few months at 120fps, I find it jarring to go back down to 60 or less.
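The responsiveness difference described above maps directly onto frame time, i.e. the interval between refreshes. A quick sketch of the math:

```python
def frame_time_ms(refresh_hz):
    """Milliseconds between successive refreshes at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")
# 60 Hz -> 16.67 ms per frame
# 120 Hz -> 8.33 ms per frame
# 144 Hz -> 6.94 ms per frame
```

Going from 60 to 120 Hz cuts the worst-case display latency by over 8 ms, while 120 to 144 only shaves off another ~1.4 ms, which lines up with the difference being obvious in the first case and barely perceptible in the second.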

Moderator
#3 Posted by CatsAkimbo (642 posts) -

@mb said:

@rm082e said:

If you're not looking to become a high level FPS player, then 144Hz is just a visual effect. The added sensory input data isn't going to make any game better (that isn't twitch based) since you don't need pinpoint accuracy to play against the AI.

As someone who actually owns a 1080p/144hz monitor, I can absolutely tell the difference and it's much more than just a "visual effect". Higher FPS makes gameplay more responsive, reduces input lag, and provides a smoother experience. I can absolutely tell the difference between 120 and 60 fps, not so much between 120 and 144. I use my 1440p monitor for anything other than gaming.

To me there is much more benefit in gaming to go from 60fps to 120fps as compared to the benefit of going from 1080p to 1440p and staying at 60hz. Of course it's all personal preference, but having played games for the last few months at 120fps, I find it jarring to go back down to 60 or less.

I have the Asus 144hz one, and yeah, it does make a noticeable difference for certain things. Even just using it on the desktop, you can drag windows around and the text is still clear and readable even while it's moving around (not that that's actually useful, but it does make everything feel super sharp and responsive).

However, the color on it is merely "ok," and that's after a lot of tweaking. The default colors are awful and washed out. If you care about color accuracy (or you use Photoshop at all), get something different.

#4 Edited by Amafi (916 posts) -

Depends. If your primary focus is games I'd go 144hz. If you do a lot of stuff at the computer other than games the extra pixels really help.

#5 Edited by newhaap (430 posts) -

I had this same question a while back, and I went with the higher refresh rate. It is noticeable, but it turns out not that many games support it, and those that do tend to be newer and require a beefy machine to hold a consistent 144 or even 120 fps.

So I think you would get a more obvious/noticeable improvement more easily with the resolution upgrade.

EDIT: Oh and I agree with @CatsAkimbo about the color as well

#6 Posted by jArmAhead (341 posts) -

@rm082e said:

If you're not looking to become a high level FPS player, then 144Hz is just a visual effect. The added sensory input data isn't going to make any game better (that isn't twitch based) since you don't need pinpoint accuracy to play against the AI.

I have a pair of Dell 27" 2560x1440 screens at work and a BenQ 27" 1920x1080 at home. The only real difference between them is that when I am using desktop applications and sitting very close, I can see just a hint of the grid on the 27" 1080 monitor at home. In game though, what matters is the resolution the game is running at, not the monitor. I downsample games on my GTX 770 from several different resolutions and I never notice the grid when playing.

If you are already sold on ASUS, they have several 27" 1920 x 1080 monitors for around $200. And if you stick with 1080, you will be able to use more of the horsepower in that 290X for pushing effects while maintaining a solid frame rate. If you have super sharp vision though, you may want to go ahead and spring for the 2560 x 1440, but I would definitely go with that over a high refresh rate monitor.

Also, just throwing this out there: If you have money to spend, you might want to look at an Ultra Widescreen 21:9 monitor. A lot of players who have tried Ultra Widescreen are saying it is the next step for gaming, rather than 4K. It's new though, so it's still relatively expensive.

I disagree, 144hz feels completely different. Have you actually used a 144hz screen for a long period of time? It may not be as "essential" to success, but it will ALWAYS feel a lot better than 60. It doesn't just look nicer, it feels more responsive and smoother as well.

And even the visual aspect of it is very different from better textures or whatever. You see more of objects in motion, catch more details, and the smooth look of everything is awesome for totally different reasons than merely cranking the graphics up to 11.

To the OP, I don't think anyone can answer this for you. Hell, I've used both a lot and I can't really tell which I prefer. I'll say this though: 144hz is a lot better than you might think, IF you have the horsepower to push it. Trouble is, with the new generation of engines coming around, pushing those framerates is going to be hard for a couple of years, most likely.

There are certainly some games that are clear-cut: ArmA 3 would be all resolution, as I'd never get 144fps (or much more than 60 at all) playing on a massive 20km map with a 10km view distance while running a full simulation of radio systems, ballistics, flight dynamics, etc. The resolution, though, would give me more detail at greater ranges, letting me identify threats earlier and further out.

Team Fortress, on the other hand, is all framerate. The visual style and close-quarters nature of the combat mean there's no advantage to higher resolutions except slightly cleaner polygons. The game looks stunning at 144hz though, and it's so damn responsive it's crazy.

It's really a personal taste thing though. If you can, try them both, and consider which would be most useful in the games you play.

#7 Posted by rm082e (205 posts) -

@mb said:

@rm082e said:

If you're not looking to become a high level FPS player, then 144Hz is just a visual effect. The added sensory input data isn't going to make any game better (that isn't twitch based) since you don't need pinpoint accuracy to play against the AI.

As someone who actually owns a 1080p/144hz monitor, I can absolutely tell the difference and it's much more than just a "visual effect". Higher FPS makes gameplay more responsive, reduces input lag, and provides a smoother experience. I can absolutely tell the difference between 120 & 60 fps, not so much between 120 and 144. I use my 1440p monitor when I'm doing anything else other than gaming.

To me there is much more benefit in gaming to go from 60fps to 120fps as compared to the benefit of going from 1080p to 1440p and staying at 60hz. Of course it's all personal preference, but having played games for the last few months at 120fps, I find it jarring to go back down to 60 or less.

So would you say the reduced input lag and "smoother experience" give you any sort of tactical advantage against the computer AI in single-player games? If so, can you describe that advantage?

#8 Edited by rm082e (205 posts) -

@jarmahead said:

@rm082e said:

...

I disagree, 144hz feels completely different. Have you actually used a 144hz screen for a long period of time? It may not be as "essential" to success, but it will ALWAYS feel a lot better than 60. It doesn't just look nicer, it feels more responsive and smoother as well.

And even the visual aspect of it is very different from better textures or whatever. You see more of items in motion, catch more details, and the smooth look of everything is awesome for totally different reasons than merely cranking the graphics up to 11.

No, I have not spent much time playing at 120fps. I played about 15 minutes of the Battlefield campaign on my brother's machine and I didn't find it gave me any tactical advantage. It is very different - I'm not saying it isn't. My point was that if you're playing single-player games like Tomb Raider and The Witcher, the added frames are not going to give you a significant tactical advantage the way they do in competitive multiplayer. This is solely because the game's AI is never going to require that level of precision from the player.

You guys are right in that this is a personal preference issue, with high frame rates on one hand and maxed out graphics on the other. If you prefer the higher frame rates, have at it. I'm not saying you're "wrong". I just think maxed out graphics are more desirable in games where higher frame rates do not provide a tactical advantage.

#9 Posted by MormonWarrior (2640 posts) -

I have this. It's awesome.

http://www.amazon.com/dp/B00B2HH7G0/?tag=pcpapi-20

Also do the following for the best color profile:

1) Download the appropriate ICC profile below and save it to a suitable place -

AMD GPU users

Nvidia GPU users

2) Set the monitor to ‘Standard Mode’ at 144Hz. The following settings were used to create the profiles but feel free to adjust if necessary -

Splendid= Standard Mode

Brightness= 24 (gave 160 cd/m2 on our unit, adjust as required)

Contrast= 75

Color Temp= User Mode

Red= 100

Green= 90

Blue= 89

3) Follow these instructions on how to activate the ICC profile. In that article you’ll also find a link to download a useful and very small utility called ‘Display Profile’ which you can use to toggle between ICC profiles. This is useful if you want to switch to default color settings (essentially no ICC profile active) when running certain applications (games etc.) that don’t use the profiles properly.
