I don't mind low FPS. I notice it at about 20 or 15 and it bothers me, but most of the time, whatever, 30 is fine. Sometimes I do get to the point where I'm looking at the counter thinking, really, this is 30 FPS? It feels like 20. Other times buttery-smooth 60 feels about the same to me as 45. Anything high like 100 I don't have the luxury of watching yet, lol. So overall, what are your opinions?
In 2024, all games should run at 60fps+ by default, or they should not be released until they do.
Consoles are always going to be a balancing act. I think speaking in absolutes like "every game should be 60 FPS on console" is limiting to developers. They should be free to use the horsepower available to them however they think is best. Now, I think every game should hit its FPS target and maintain it without dropping regularly, but dictating that every game should reach 60 may prevent developers from doing other cool or interesting things with their game that may not be possible at 60 FPS on a console. That is especially true when talking about severely power limited machines like the Switch.
Of course, on PC there should be no frame rate cap other than whatever makes sense for each user's hardware; games should go as high as someone's PC can achieve.
...casting aside that framerate tolerance and perception are a variable, personal thing-
it really doesn't bother me as long as 30 or so is the floor and remains consistent. i've never been super performance-minded or gotten into tuner culture despite custom pc builds, and the only 60+hz screen i own is a midrange TCL tv i just recently purchased due to my previous screen kicking the bucket.
it's nice to have for some genres, and a steady 60fps suggests a certain amount of stability and optimization.
but right now i'm playing BG3 on PS5 with the locked 30 setting, and the loss of frames is pretty negligible to me in a turn-based combat game. and i'm much happier to be playing this port now than asking Larian to burn more cash for a feature that is personally superfluous to me.
I'm fine with 30 fps. The only way I've ever been able to tell the difference between 30 and 60 is by flipping back and forth between the two if a game provides the option. I can tell when something dips below 30, but if a game is locked at 30 and doesn't tell me what it's running at, I wouldn't know the difference from one locked at 60.
I’ve become a 60fps convert. It didn’t use to bother me too much, but in the past few years it literally makes my eyes go wonky if things are at 30fps. Bloodborne was the first example I noticed this with: I played and loved it on PS4 back in 2017 or whenever it was, but trying to play it again recently, my eyes start to tremble and water when I spin the camera around. This is less of an issue for things like puzzle games or games with lots of constant HUD elements, but 3D games with 360-degree camera control are the worst offenders. At 60fps, no problem. I’ve run tests on other games, particularly PC games that offer frame caps, and 30fps is consistently harder to watch than 60fps.
I don’t want to be a snob, 60fps isn’t needed in all cases, and I dunno if it should be a forced standard, but I certainly wouldn’t complain if it were.
I'm very sensitive to frame rate disruption, low frame rates, and input latency, so I need as many frames as possible.
Most games that don't require extremely precise inputs, like first-person shooters do, can run at 60fps with vsync and I don't mind that, but anything lower than 60 is broken to me; that's one of the reasons why I went PC-only.
It blows my mind that some people don't demand 60 or are fine with 30. I would trade every graphical advancement of the last 15 years for a high frame rate.
I would tell you that framerate is very important, and that having everything on every device run at 60FPS should be a goal that developers and publishers alike should strive for. Personally, I can play and enjoy a game running at 30FPS, but it's always a jarring transition and I never appreciate it. I would rather play at 60+ all the time, every time, and that framerate is a major reason I keep my PC somewhat up to date.
...but here's the thing. Most people don't notice. Over the past few years a lot of console games have had pretty good 60FPS modes and I bet that did "convert" some people to the 60FPS fandom, and some of those people might go on to get into PC gaming just to keep 60FPS in every game. But the vast majority of people who wanted to play Starfield at 60FPS are going to grumble, maybe rage online a bit, then download and play Starfield for a hundred hours on their Xbox.
This isn't just about optimization, either. 60FPS costs more resources. As in, the Xbox Series X and PS5 have a limited amount of resources that any given game can draw on. You can use those resources more efficiently only to an extent. At the end of the day some games have too much going on or are too complex to run at 60FPS on those consoles. At that point, you either sacrifice something significant to the game or you cut down to the 30FPS cap that most people are going to be fine with.
Also, when I say "too much going on" and "too complex", I'm not just talking about graphics and resolution. Sure, you can cut down resolutions and graphical effects and such and make a game run better, but again, sometimes there's stuff going on under the hood that you just don't think about. Baldur's Gate 3's Act 3 is a prime example - the city doesn't look any better than the previous two acts, but there are many, many more moving parts. There are so many items, NPCs, and such that have a physical presence in the world, and that adds a lot to the resource cost for that game in a way Larian's engine doesn't seem to be very good at handling.
And, again, most people who play games on console have proven time and time and time again that even in the presence of 60FPS games, they'll go buy games with 30FPS caps and barely even notice. On the PS2, Jak 3 was a 60FPS game. So was Ratchet and Clank. Yet Grand Theft Auto sold a zillion copies. On the 360, Call of Duty was a 60FPS game, but Halo 3 still sold a zillion copies, along with a bunch of other games. The comparison to 60 has always been there on consoles and it's always been a good selling point but it has never once knee-capped a game's sales. On the other hand, visual splendor can move copies of games and get people to really pay attention. Think about Star Wars 1313 and how bummed people were that it got canceled. It looked like "Star Wars but Uncharted" as far as gameplay goes but visually it was amazing for the time.
We won't see 30FPS games disappear in favor of 60FPS ones until one of two things happens - the hardware far outpaces what developers can actually do with it, or until consumers stop buying 30FPS games altogether and vocally announce that they're doing this because of 30FPS caps. It doesn't seem like either of these are going to happen anytime soon.
Maybe I'm old, but I really can't tell the difference between 30 FPS & 60 FPS. I had someone show me the difference in still frames and realized that it basically only matters in online competitive games. Even then, you have to have good enough internet on both ends for it to matter.
My theory is that people are training themselves on these higher FPS and it's messing up their minds. It's like listening to podcasts at 2x+ speed and then feeling like everyone around you talks really slow.
30FPS doesn't bother me when it's smooth and stable without any frame pacing issues. The Resident Evil 4 Remake is a perfect example of a game with a great 4K 30FPS mode that runs so smoothly it's almost hard to tell the difference between it and the 60FPS performance mode, even though I still can. Sadly, a lot of games that have a "quality" 30fps mode don't run that smoothly. Jedi Survivor's 30fps mode was also surprisingly good, but I played it on a PS5, not the PC where most of the problems seem to be. And I find that people who claim they can't tell the difference are that way because they're so used to playing console games, most of which are locked at 30FPS, whereas people like myself and others who experience both PC and consoles see the difference a mile away.
Stable frame rate. That's all I want: a stable 24 frames, 30 frames, or 60 frames. We have this wonderful slab of organic matter in our skulls that does a great job interpolating movement, as long as the animation is consistent and clear.
The caveat is of course the DEVICE you are watching on. The best result is always the media and the device matching at some divisible rate. Since modern monitors have a "preferred/native" sync rate, that is typically best. In a movie theater with an old projector (not digital), a 24 fps film will look best going through that machine. On a modern movie projector that is digital, the best rate is whatever is NATIVE for that projector. On a TV at home, again, native resolution with the native sync rate is best.
So setting a MUST-have rate is pointless if you have a 120 Hz or 240 Hz monitor, because then you are just having a CPU speed it up or slow it down. We live in a world where people have 60 Hz, 75 Hz, 120 Hz, 144 Hz, and 240 Hz refresh rates, so demanding a certain fps is pointless; you are still making a CPU fix it for you, speeding it up for some people and slowing it down for others.
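To make that "divisible rate" point concrete, here's a rough sketch (purely illustrative, assuming a fixed-refresh display with no VRR or motion interpolation; the refreshes_per_frame helper is made up for this example) of how many refresh cycles each game frame ends up occupying:

```python
# Illustrative only: counts how many display refreshes show each game frame,
# assuming the display always shows the newest frame that has been completed.
def refreshes_per_frame(fps, refresh_hz, frames=8):
    counts = [0] * frames
    tick = 0
    while True:
        # Frame that is current at this refresh; integer math keeps it exact.
        frame_index = (tick * fps) // refresh_hz
        if frame_index >= frames:
            break
        counts[frame_index] += 1
        tick += 1
    return counts

print(refreshes_per_frame(30, 60))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even pacing, looks smooth
print(refreshes_per_frame(45, 60))  # [2, 1, 1, 2, 1, 1, 2, 1] -> uneven pacing, visible judder
print(refreshes_per_frame(24, 60))  # [3, 2, 3, 2, 3, 2, 3, 2] -> the classic 3:2 pulldown cadence
```

30 fps on a 60 Hz panel holds every frame for exactly two refreshes, so motion stays even; 45 fps averages more frames per second but the per-frame hold times wobble, which is why an uneven rate can look worse than a lower but stable one.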
Stability is all that's really important to me. I find switching from one to the other jarring, but once I get over it I don't notice until I have to switch back.
I think I do prefer 60fps, obviously, but I always go for the graphics mode when it's presented and just feel kind of bad about having to make the choice. I'd rather a videogame scientist make that choice for me so I can just plug and play.