Is 720 60 good enough?

IndeedCodyBrown

Poll Is 720 60 good enough? (435 votes)

Yes 33%
No, I need at least 1080 30 14%
No, I need at least 1080 60 46%
No, I need at least 4k because I am a being of the future. Kneel before me. 3%
What? I don't play video games where am I? 4%

I feel weird because the recent talk of resolution and framerate seems a bit unnecessary to me. I would still be happy with games that run at 720 60, with the extra hardware power going into more complex systems and gameplay mechanics. Is this just me?

mike

@mambogator: He's not talking about films, he's talking about video games. And yes, I am talking about playing games at 144 frames per second on a 144 hertz monitor.

I think all of us who own high refresh rate monitors are well aware of the difference between refresh rate and render rate, but thanks for your post. What kind of monitor do you have?

mike

@mambogator: I'm not talking about console games, I was directly responding to your original comment. Anyway, my original statement stands; no need to get pedantic about it.

The_Last_Starfighter

I know this is a console-specific question, but on the PC side I've found 2K (2560x1440) to be the sweet spot for quality vs optimisation.

On the console side I've never had any issue with 30fps, I would def want that running at 1080 though.

mike

@mambogator: I do have the hardware and can tell the difference. At least two other people in this topic alone who have the hardware say they can also tell the difference.

Hopefully one day you can experience this for yourself instead of relying on studies.

Shivoa

Erm, seeing flickering lights as solid illumination has absolutely nothing to do with being able to pick up motion cues. If that's the quality of the research you're citing, then anecdotal evidence is actually going to win this battle.

Also, 48 flashes of light per second means 48 lit frames and 48 dark frames being distinguished, so you just cited a study saying people can see 96Hz as distinct frames, i.e. 60Hz is not enough.
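If it helps, here is that arithmetic spelled out in Python (a minimal sketch, assuming a sample-and-hold LCD where each distinct flash needs one lit frame followed by one dark frame):

    # Each visible flash needs a lit frame plus a dark frame on a
    # sample-and-hold LCD, so N flashes/second need 2*N refreshes/second.
    flashes_per_second = 48        # the figure from the quoted study
    frames_per_flash = 2           # one lit frame + one dark frame
    required_refresh_hz = flashes_per_second * frames_per_flash
    print(required_refresh_hz)     # 96 -> more than a 60Hz panel can show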

jaycrockett

I just started Gears 4 on PC and was thinking about this. I was surprised how clear it looked at just 1080, and remembered that most 360 games were well below 1080, often even lower than 720.

My PC isn't great, so I locked the frame rate at 30 so I could boost the quality. When I get the chance, I think I'll try lowering the resolution to 720 to see if I can get it locked at 60, and see what that feels like. I didn't think I'd need 60 for Gears, since I'm pretty sure none of the earlier ones ran at 60. But maybe I've just gotten used to higher framerates on PC.

MOAB

I have a 144Hz monitor and I don't understand how anyone could not see a difference. Even my grandmother could tell the difference when moving a folder across the screen really fast.

@jaycrockett said:

When I get the chance, I think I'll try lowering the resolution to 720 to see if I can get it locked at 60, and see what that feels like.

Try using lumasharpen in Reshade if 720p looks blurry. Lumasharpen is amazing.

mike

@mambogator: We can continue this discussion once you've experienced high frame rate displays for yourself. There is a difference and it's not a subtle one either. The difference is even immediately obvious when doing something simple like moving a window around on the desktop like gundogan mentioned.

Anyway, as someone who owns this hardware, I'm saying there is a noticeable difference. That's really all I have to say on the subject, if you don't believe it's noticeable and don't want to see it for yourself then that's your prerogative I guess.

peacebrother

You'd have to be blind to not see the difference between 60 and 144. I can feel a difference even when it jumps to 75/80. It is blindingly obvious what the difference is when it's in front of you.

The difference in fluidity, smoothness, and clarity is as clear as day. After 120 it's a bit harder to notice, but you can still definitely feel and see it.

Hell, even moving my mouse around the desktop shows a drastic increase in smoothness of motion.

Zirilius

1080p60 is the ideal minimum, but I'd take 720p60 over 1080p30 any day.

This, in a heartbeat. The new standard for that is 900p60 rather than 720, but I'd much rather the framerate be there than the resolution.

Shivoa

@mambogator: Sorry, you're making yourself sound even less capable of talking about this topic. Please just stop digging (everyone who has seen a 144Hz screen in this thread is already laughing at you).

"1 hertz is one full cycle (in the case of screen flickering, that is the light-dark cycle). So, no, 48 light "frames" and 48 dark "frames" is not 96 Hz"

You really don't get it. "In laboratory testing, it has been found that the human sense of sight can distinguish up to 48 flashes of light per second" means there were 48 discrete periods of white. If you display a 48Hz signal that is all white frames on an LCD, do you know what it looks like (say on a 2000fps high-speed camera)? Just white. The liquid crystals are all held open permanently and you just stare into the backlight's white. To get an LCD to display 48 discrete flashes of white light would require a 96Hz signal that interleaves white and black frames. That's just how it is. That's the truth, and your confusion about what is going on may indicate how you have come unstuck from the discernible reality everyone in this thread has seen with their own two eyes (and why you are apparently quoting sources that contradict your assertion and then saying the issue is that the author you quote doesn't understand frames and Hz).

Which is more likely: all the sources you're reading are from people who don't get the difference, and everyone else in this thread who has a history of posting on technical topics and has first-hand experience of 144Hz screens (which do absolutely take 144 frames a second of input from the PCs driving them) also fails to grasp this, OR it is you who has become confused about this? I would advise reading up on tech like FreeSync or G-Sync, as understanding how those technologies modify the expected behaviour to provide variable refresh rates, scanning out frames as soon as they're rendered rather than at a fixed cadence, covers how displays and computers work together.

SnowyPliskin

How about some good games first.

deactivated-5a00c029ab7c1

1080p 60fps. It's 2016; we shouldn't even be having these discussions, but Sony and Microsoft decided to release underpowered consoles and it's holding devs and gaming back. See Watch Dogs, The Witcher 3, and The Division, all downgraded because Sony and MS were too cheap in 2013 to release a decent-spec machine; it was already outdated by three years when it came out. That's why these conversations about specs come up these days, compared to last gen, when the consoles were more on par with PCs, as much as they could be at least.

Shivoa

@mambogator: Dude, you're throwing out flat-out wrong statements, derailing everyone else having a decent discussion or few in here, and then claiming to back your position up with science while quoting people who directly contradict your assertions. Snark takes skill, and actually being right because you have a strong understanding of the area under discussion. Without that, you're just acting the clown.

I'm not angry or upset, I'm gobsmacked you're still pretending you know what you're talking about.

Fredchuckdave

Graphics are rarely judged purely on framerate or resolution; art style matters far more, except in the rare case where the game just looks way better than everything else on the market (Crysis, The Witcher 2). Metroid Prime still looks better than the vast majority of games out there just because of the art design. That said, a game will always look better fully optimized for 30 FPS than for 60, just due to the nature of things; however, if you want the game to play smoother, you'll naturally prefer 60, though for most games this matters hardly at all.

This is one of those totally uninteresting topics, I find; almost anything else you could broach would produce more compelling arguments from any party. In fact I imagine it exists solely because of the drive to always buy newer/better PCs to run theoretical games that don't exist, because games are almost never developed for that audience and in the future will NEVER be developed for that audience: the cost of creating games is going up exponentially, to the point where it just isn't logistically possible to make games look better except at an extraordinary financial loss. Enter CD Projekt Red to shit all over my argument, bastards.

Zevvion

The way you formulated your question suggests you either don't have a powerful PC, or you play on console all the time. I am not an elitist. I actually spend most of my time on a console. I play wherever the games I want to play are. But if you've played 1080 or even 4K at 60-120fps regularly, you really don't want anything else unless there is no other possible way. And you said:

I would still be happy with games that run at 720 60, with the extra hardware power going into more complex systems and gameplay mechanics.

That's just it, though. If you have a good PC, you don't need to make a choice there. You can have the most complex physics systems and gameplay mechanics and still run at 1080 or 4K at 60-120fps. I don't think you fully realize this isn't an ultimatum. You're essentially asking whether you want great resolution and high fps, or low resolution and low fps. There is no benefit to being lower tier from an experience perspective; only financially, and not even that, necessarily.

@mambogator I think you're making a mistake by assuming a lack of knowledge on the part of the people you quoted. As their posts read to me, they are fully aware of everything you just explained. I'm pretty sure they're just trying to say that when a game hits 120+fps you need a 144Hz monitor to actually see it, as a monitor that refreshes itself only 60 times per second makes anything a game renders beyond 60fps redundant.

Shivoa

@mambogator: You quoted someone explicitly talking about 48 distinct white flashes, i.e. exactly the thing that would require a 96Hz LCD screen being fed a 96 frame per second input to replicate. And this was considered a test in which each frame transition was visibly distinct. So your own quote definitively says 60Hz is not enough.

You jumped into this thread saying "your brain can't differentiate between anything much higher than that [60fps]". This is categorically wrong, in fact it's trivial to demonstrate (just have a high refresh rate screen next to a 60Hz screen and use a Mk 1 eyeball). You have in fact quoted only people who agree that this is an incorrect statement.

We are talking about PC LCD screens that say 144Hz on the sticker and can typically refresh the panel at various rates up to that maximum. They also take a signal with up to 144 frames per second. If they are FreeSync or G-Sync then they can actually accept and display frames at a variable rate; otherwise they are at a fixed cadence that the handshake determined. When not using a variable refresh rate, that handshaken refresh is the frequency that everything is fixed to. So if I set my monitor here to 1080p100, then the panel refreshes at 100Hz, the DVI cable between it and the PC has exactly 100 frames a second sent down it, and the GPU driving it scans out 100 frames for that screen to display. If playing something like a game, then the GPU may not have rendered a new perspective to a buffer, and so when the scan-out time comes it'll scan out the same frame it previously rendered and sent to the screen. This repeated frame would be where we say the game's framerate has dropped below the screen's frequency.
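To make that fixed-cadence behaviour concrete, here is a rough Python sketch (the 100Hz/60fps numbers are just the example above, nothing vendor-specific):

    # Simulate one second of scan-outs at a fixed refresh rate while the game
    # renders new frames at a lower rate; leftover scan-outs repeat the
    # previously rendered frame.
    refresh_hz = 100                          # monitor handshaken at 1080p100
    render_fps = 60                           # game finishes 60 new frames/second
    frames_ready = 0
    repeats = 0
    for i in range(refresh_hz):
        now = i / refresh_hz                  # time of this scan-out
        completed = int(now * render_fps)     # frames the game has finished
        if completed > frames_ready:
            frames_ready = completed          # scan out a newly rendered frame
        else:
            repeats += 1                      # scan out the same frame again
    print(repeats)                            # 41 of 100 scan-outs are repeats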

But none of that detail of how actual screens work has anything to do with your weird assertion that people can't see above 60Hz. It certainly has nothing to do with the way you've quoted material that explicitly agrees people can see things above those frequencies (like the example I started with of what refresh rate, and an identical framerate feeding a new black or white frame at each scan-out, would be required to display 48 distinct white flashes in a second on the type of monitors we are discussing).

Zelyre

I remember loving the crap out of Quake 3 at 1600x1200 at 100fps on a Geforce 3 on my Viewsonic CRT in 2001. And degaussing that CRT.

I'll take resolution and frame rate and give up things like TressFX or PhysX water. Native resolution and frame rate come first, and I will start ticking off graphical settings to get there. If a game has PhysX water or hair, those are the first things that get sacrificed to the frame rate gods. Once you get used to 60 or higher? Years ago, I installed an ENB package and thought it had tanked my Skyrim framerate into the teens. Nope. Pretty much a solid 30-40.

You won't see many games with dumbed-down graphics in exchange for framerate (unless it's VR). You can market pretty graphics much more easily than you can market frame rates or realistically modelled car physics. People look at a super pretty linear hallway shooter with baked lighting and baked physics, and then expect another game of a totally different genre, with real-time lighting and physics, to look just as good.

Shivoa

@mambogator: So... you're conflating transition time (often measured in GtG, grey-to-grey transitions, because overdriving actually made the original black-to-white-to-black measurement faster than the smaller intensity changes) with the refresh rate of the panel or the signal? So you think the refresh rate of the panel or the input signal is the key to blurring due to pixel transition times, and so think people are just seeing blur from "the high refresh rate" (actually a function of pixel transition times) and not the higher frame rate itself, which draws the mouse in more discrete locations on the screen (providing a smoother visual representation of the mouse and lower latency due to more frequent scan-out, which means the mouse position on screen is slightly closer to its current position)?

OurSin_360

@mike: If you profess to be able to tell the difference between 60 and 144 frames per second (and if you have hardware that can run modern games at that rate, kudos) I can't disagree with you. But every study I've read on the subject over the years determined that the average person has difficulty differentiating between animations running at higher than around 60 FPS (give or take roughly a dozen by some studies. I've seen as low as 48 and as high as 74). Recently some are claiming humans can differentiate up to about 150 FPS but I checked the cited source of these claims and it doesn't actually say that. The actual claim is:

In laboratory testing, it has been found that the human sense of sight can distinguish up to 48 flashes of light per second. [...] The flickering of lamps operating on alternating current at 50 cycles per second is not visible to the human eye

(source)

I think film and video games should be viewed in two different lights, as one being interactive seems to affect how your brain perceives movement and the fluidity of frames. I've never seen anything above 60 personally, but I used to be unable to tell the difference between 60 and 30, and now going back to 30 is jarring at first, and any drop below 50 is noticeable to me. So the perception of frames might be tied to more than just sight because of the control interactions; I'm not sure if any tests have been done on video games and interactive media.

Zevvion

@mambogator: For what it's worth: I have a 144Hz monitor and frequently play games at around 60fps on it without any noticeable screen tearing. Additionally, there isn't an actual established cap on human perception of frames. All that exists are observational studies that prove nothing beyond what a certain group of people could perceive, which cannot be used as evidence for what anyone can see. I can name some arguments to support that. First, the earliest studies showed no higher than around 30fps of distinction. At the time, people knew that was nonsense, as they could clearly see the difference between 30 and 60fps. Newer studies indeed show much higher perception, up to 70-150fps depending on the study. Which means either humans have somehow adapted over the years to perceive more fps, or it's just an observational study and nothing more. Even today you can find studies that claim a ~40fps cap, while others claim 150. Secondly, you can compare this to taking people in for a physical test and then using that information to determine the cap of human performance. If you take in the general populace (which is what these studies do), it should be clear to you there is no way in hell the results are evidence of human limits. All they prove is what the performance of those individuals was. Perception of frame rate is not that different.

Heck, I had a date who couldn't tell the difference between 20 and 60fps. I can clearly see a difference between 30 and 60 or even 60 and 80. You can't use inconclusive studies to say I can't. It doesn't work like that.

Shivoa

We're talking computer generated images.

Everyone on here who has been playing a decade of games has been using their brain to convert frozen instants, with (until very recently) no, little, or inaccurate motion blur for motion cues, into moving pictures. That is unlike how they see the real world (which, the studies seem to show, cannot be conceptualised as recording "frames", as we have a continuously open shutter) or how they see recorded live-action footage (where each frame contains the integral of 1/48th of a second of open shutter, followed by 1/48th of a second of no recording, and then open again, for the classic 180-degree shutter that is the tradition for modern cinema).

Our brains are able to adapt to a lot so "X is unusual and causes sickness" studies really only work when selecting a population who aren't used to X, which we here are used to. Every person who should wear glasses and doesn't is currently training their brain to work with a broken view of the world. We are good at adapting and high framerates are not making us sick. PS VR doesn't make you sick because it's too fast (at 120Hz), which is something that assertion would conclude, but because of simulator sickness.

Again, this is all stuff about non-interactive scenes rather than something where we have input and expect it to instantly be represented on the screen. Motion-to-photons as they say in VR. That's the precise responsiveness that people find incredibly easy to notice just by moving a (decent, high rate) mouse around on a desktop with a 100-144Hz monitor.

BradBrains

"good enough" is a fine term for it though I think 1080p and and 30fps should be the minimal for hd gaming at this point

MEATBALL

1080p certainly looks clearer on my TV than 720p does, but I sit far enough away that the lower quality of 720p doesn't particularly bother me when playing games. If I had the choice between 1080p and 60fps I'd take 60fps every time.

NTM

First time using my phone on Giant Bomb (or commenting on any site on my phone, for that matter)... Anyway, the only reason one would be okay with 720p is if they don't care to have a bigger TV with 1080p or above. Just a few years ago I used a 19" HDTV, and that was perfectly fine at the time, but on getting a 40" HDTV I jumped to 1080p. As for fps, if the game has a steady thirty at 1080p, that's fine by me. Sixty is always great, but not necessary for newer games (I'd rather have higher quality visuals with a steady fps than a less pleasing visual presentation at sixty). I'm currently perfectly fine with 1080p, and don't need to jump to 4K.

The bigger the TV, the higher the resolution should be, I think. Inconsistent frame rates are the issue. What they can do with the visuals, not just in terms of detail and effects but also animations, to bring a more immersive experience and what have you, is more important to me than 4K or striving for a constant 60fps. There need to be boundaries though, because they shouldn't ever make a game so demanding that the system can't handle it frame-rate-wise. Something that's important for all TVs, I think, is for the viewer to sit at an appropriate distance from the TV, as well as trying to calibrate it.

colourful_hippie

I wanted 1080p/60fps videos years ago.

Mamba219

SD is fine.

AquaGeneral

I'd say 720p @ 60fps is definitely "good enough". It actually requires more power to do 1080p at 30fps (~62 million pixels/second) than 720p at 60fps (~55 million pixels/second). I think it really depends on the game whether I would prefer 1080p 30 vs 720p 60.
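For anyone who wants to check that arithmetic, a quick Python version of the same sums:

    # Pixel throughput (pixels rendered per second) for the two modes above.
    def pixels_per_second(width, height, fps):
        return width * height * fps

    print(pixels_per_second(1920, 1080, 30))   # 62,208,000 (~62 million)
    print(pixels_per_second(1280, 720, 60))    # 55,296,000 (~55 million)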

Brendan

Until I get a TV with a screen size bigger than 55 inches (65 inches at least) I won't care much about gaming above 1080p 60fps. On the computer I'm building I'll be going 1440p at a minimum of 30fps, or 1080p 60 if possible.

WesleyWyndam

I have a 144Hz monitor and I've noticed that I can easily see the difference in frames until around 90-100. Above that it gets difficult to notice a difference.

winsord

I would take fewer graphical effects and at least 900p 60FPS, ideally. I would still take 720@60 over 1080@30 though; anytime I see games running at 60FPS or higher, I breathe a big sigh of relief. 30 isn't unbearable by any means, and I'll take a stable 30 over a variable 40-60 in most cases, but 60 looks and feels so much better that it's hard to ignore.

dstopia

I wouldn't mind 720p/60 if the native resolution of the TV was 720p. However, most TVs/monitors nowadays are 1080p, so 720 just looks fucking awful on them.

So 1080p/60 it is. Barring that, 1080p/30, but that's a bummer.

jaycrockett

So I tried Gears at every resolution between 1080 and 720, but I did need to drop to 720 to get a locked 60 frames per second with high settings. And boy, did it look bad at 720, at least compared to 1080.

I definitely can tell the difference between 30 and 60 fps, but it wasn't worth the drop in visuals.

So I guess for me, with Gears 4, no, 720 60 isn't good enough.

OurSin_360

@jaycrockett: You could try messing with your scaling options; I'm not sure if AMD cards have them, but Nvidia's do. Changing to GPU or display scaling could give you a better result, though obviously native resolution would look better either way.

TPoppaPuff

The problem is that more complex systems and gameplay mechanics are almost always more strenuous on the CPU, not the GPU. Even games like Assassin's Creed, among others, have more frame drops on the PS4 than the Xbox One despite the PS4 having 1.5x the GPU power. So more complex gameplay systems, like more complex AI, are not going to lead to 60 fps games at 720p. More complex game systems are going to lead to 30 fps games no matter what the resolution is.

Now if we're talking PC games and not consoles, then who cares? Just throw more money at it.

As for what I prefer, I want nice visuals at a nice resolution even if it means sticking to 30 fps in single player games. In competitive multiplayer games, all titles should run at 60 fps. If it's skill based it needs to be 60 fps. If it's VR it should be 90 fps or 120 fps, but that also includes PSVR as it "cheats" to get double the rate on 60 fps games. For a game on a screen not an inch from your eyes, 60 fps is fine. More than that is simply the posturing of "ePros."

theanticitizen

720/60 is... okay on console. The upscalers on them and most engines work out a lot of the bad pixel shimmer (e.g. look at BF4 on X1 compared to BF1; the tweaked engine helped with aliasing on rough edges), but I can't play games on PC at anything under my monitor's native resolution. Which sucks for me, because my PC is not very well optimized and most games run between 45-60 FPS on it. :/

clagnaught

720p was fine on the PS3 for me. This generation everything should be in 1080p. On my PC, I would like my games to run at 60 FPS, but I'm also Ok with 30.

My ideal for everything is 1080p, 60 FPS. Decent 4K gaming still feels like it's 3-4 years off, but it's here today because companies need to sell new monitors and TVs.

rocketblast0063

I would say yes. I would also say that I would prefer 720@60 fps over 1080@30 fps. A low fps is less forgiving and easier to notice at a higher resolution, so in a way, in theory, I could prefer 720 over 1080 at 30 fps. And that's just the visual part of it. In practice, on a PC, I play at my monitor's native resolution.

Input
When it comes to action games, the difference between 30 and 60 (and for that matter 60 and 120) can be huge, since inputs are validated on each frame. On a PC this is easy to test in many games where you can lock the FPS to compare. The Hz of the monitor doesn't affect this, by the way; the Hz is about the presentation of the picture, not the underlying game engine.
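A rough illustration of that in Python (a simplification that treats input as being sampled once per frame; real engines differ in the details):

    # If input is only read once per frame, the worst-case wait between a
    # button press and the engine acting on it is roughly one frame time.
    for fps in (30, 60, 120):
        frame_time_ms = 1000.0 / fps
        print(f"{fps} fps: one frame = {frame_time_ms:.1f} ms of worst-case input delay")
    # 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms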

Movies
It's a different story, since you don't have inputs or motion that relies on the viewer. It also has a lot of motion blur, generally speaking. A 24 or 30 fps video can still stutter a lot, especially at 1080 or higher during panning scenes.

Zevvion

@mambogator: The issue is that either you or someone else claimed being able to perceive beyond 60fps is impossible for the human eye and used research to back it up. I can absolutely, 100%, dismiss scientific studies. Just because something is called a 'scientific study' doesn't mean it's accurate, correctly interpreted, well measured, or anything else. If you have a job that requires you to act based on scientific evidence, you'd know that at least half of the studies you'll see are inconclusive and another half of the remainder are incorrectly interpreted.

In this case, the cited studies' claim that humans can't perceive fps differences beyond 60fps is, unsurprisingly, completely false. The difference between 60fps and 120fps is massive, and there aren't any professional gamers who don't notice it, never mind the idea that it's a human limitation. This is also why you can find studies that claim people can distinguish up to 150fps or even higher.

Common sense needs to be applied when reading studies; don't just take a study as fact simply because it is a study. It's not holy; it's meant to be challenged and doubted, in this case rightfully so.